A quick look at Microsoft Surface

This content is 15 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

A couple of weeks back I managed to get a close look at a Microsoft Surface table. Although Surface has been around for a while now, it was the first time I’d been “hands on” with one and, considering it’s really just a bunch of cameras and a PC running Windows Vista in a cabinet a bit like a 1980s Space Invaders game, it was actually pretty cool.

One thing I hadn’t appreciated previously is that Surface uses a totally different technology to a multitouch monitor: rather than relying on capacitance, the Surface table is sensitive to anything that reflects or absorbs infrared light. It uses an infrared emitter and a series of cameras to detect light reflected by something on the surface, then processes the image and detects shapes. There’s also an API so that software can decide what to do with the resulting image and a DLP projector to project the user interface on the glass (with an infrared filter so as not to confuse the input system). At the moment, the Surface display is only 1024×768 pixels but that didn’t seem to be restrictive in any way – even with such a physically large display.

In some ways Surface behaves like a touch device – and its multiple cameras even allow it to perform stereoscopic, three-dimensional gestures – but, because it lacks direct touch capabilities, there is no concept of a hover/mouse-over. Indeed, the Surface team’s API was taken and extended in the Microsoft .NET Framework version 4 to work with Windows Touch and, at some point in the future, the Surface and Windows Touch APIs will converge.

The Surface technology is unable to accommodate pressure sensitivity directly but the underlying processor is just a PC and has USB ports, so peripherals could be used to extend the available applications (e.g. a fingerprint reader, card reader, etc.).

Surface can also recognise the type of object on the glass (e.g. finger, blob, byte tag) and it returns an identifier along with X and Y co-ordinates and orientation. When I placed my hand on the device, it was recognised as five fingers and a blob. Similarly, objects can be given a tag (with a value), allowing for object interaction with the table. Surface is also Bluetooth and Wi-Fi enabled so it’s possible to place a device on the surface and communicate with it, for example copying photos from the surface to a phone, or exchanging assets between two phones via the software running on the table. Finally, because Surface understands the concepts of flick and inertia, it’s possible to write applications that make use of this – such as the demonstration applications that allow a globe to be spun on the Surface display, create a rippled water effect that feels as though you are interacting with it, simulate gravity, add sprung connections between items on the display, or make them appear to be magnetic.
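To make the kind of data described above more concrete, here’s a sketch of a contact record carrying an identifier, co-ordinates and orientation per recognised object. This is a hypothetical illustration in Python – the real Surface SDK exposes .NET types, and none of the names below are taken from it:

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class ContactType(Enum):
    """Kinds of contact Surface can distinguish on the glass."""
    FINGER = auto()
    BLOB = auto()
    TAG = auto()   # byte/identity tags attached to physical objects

@dataclass
class Contact:
    """One recognised contact: what it is, where it is, and which way it faces."""
    contact_id: int               # stable identifier while the contact persists
    contact_type: ContactType
    x: float                      # position on the display, in pixels
    y: float
    orientation: float            # rotation in degrees
    tag_value: Optional[int] = None  # only set for tagged objects

# A hand placed flat on the table might be reported as five fingers and a blob:
hand = [Contact(i, ContactType.FINGER, 100.0 + 20 * i, 200.0, 0.0) for i in range(5)]
hand.append(Contact(5, ContactType.BLOB, 140.0, 260.0, 0.0))
```

An application would then decide what each contact means – for example, treating a tagged object differently from a finger.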

One technology that takes this interaction even further (sometimes mistakenly referred to as Surface v2) is Microsoft’s SecondLight, which uses another set of technologies to differentiate between the polarisation properties of light so images may be layered in three dimensions. That has the potential to extend the possibilities of a Surface-like device even further and offer very rich interaction between devices on the Surface.

At present, Surface is only available for commercial use, with a development SKU offering a 5-seat license for the SDK and the commercial unit priced at £8,500. I’m told that, if a developer can write Windows Presentation Foundation (WPF) applications, they can write Surface applications and, because Surface runs WPF or XNA, just as an Xbox or a PC does, it does have the potential for games development.

With touch now a part of the operating system in Windows 7, we should begin to see increasing use of touch technologies, although there is a key difference between Surface and Windows Touch: the vertically mounted or table form factor affects the user interface and device interaction. For example, Surface also detects the direction from which it is being touched and shows the user interface in the correct orientation. In addition, Surface needs to be able to cope with interaction from multiple users with multiple focus points (imagine having multiple mice on a traditional PC!).

My hour with Surface was inspiring. The key takeaways were that this is a multi-touch, multi-user, multi-directional device with advanced object interaction capabilities. Where it has been used in a commercial context (e.g. AT&T stores) it has mostly been a novelty; however there can be business benefits too. In short, before deploying Surface, it’s important to look further than just the hardware costs and the software development costs, considering broader benefits such as brand awareness, increased footfall, etc. Furthermore, because Surface runs Windows, some of the existing assets from another application (e.g. a kiosk) should be fairly simple to port to a new user interface.

I get the feeling that touch is really starting to go somewhere and is about to break out of its niche, finding mainstream computing uses and opening up new possibilities for device interaction. Surface was a research project that caught Bill Gates’ attention; however there are other touch technologies that will build on this and take it forward. With Windows Touch built into the operating system and exciting new developments such as SecondLight, this could be an interesting space to watch over the next couple of years.

Apple’s new multitouch mouse misses the point


Last week Apple updated its product line, ahead of Microsoft’s Windows 7 launch, and one of the new announcements was a replacement for the “Mighty Mouse”, which was quietly killed off a few weeks back after years of doing anything but living up to its name (as Adam Pash notes in Lifehacker’s coverage of Apple’s new lineup).

I first heard about Apple’s new “Magic Mouse” on Twitter:

“RT @real_microsoft: RT @Mirweis Once again #Apple seems to have nosed ahead of #Microsoft with the multitouch mouse: ”

[@michaelsfo]

Apple’s latest mouse is indeed a multitouch device that uses gestures to control the screen. As should be expected, it looks great but, as TechRadar reported, it doesn’t support a key gesture – the pinch zoom that we first saw on the iPhone and that Apple has made synonymous with multitouch through its advertising.

Furthermore, there’s no touch screen on any of Apple’s refreshed line-up. In fact, the iMac changes are mostly evolutionary (and there’s a new unibody entry-level MacBook). Meanwhile, with the launch of Windows 7, Microsoft now has advanced touch capability available within the operating system. A multitouch mouse is cool – seriously cool – but the real advantages of touch come with touch screens and other displays that take concepts like the Microsoft Surface table into mainstream computing uses.

Some people might not think touch is really a big deal, or that it’s just a bit gimmicky right now – but step back and take a look at what’s happened with smartphones: in 2007, Apple launched the iPhone and all we’ve seen since then is an endless stream of competing devices – each with multitouch capabilities. Now that’s crossing over into the PC marketplace and, unlike tablet PCs, or early Windows Mobile devices, there’s no need for a stylus and that’s why I believe touch will become much more significant than it has been previously. Only yesterday, I watched my young sons (both of whom are under 5) using one of Ikea’s play kiosks and they instantly knew what to do to colour in a picture on screen. As soon as prices drop, I’ll be buying a multitouch monitor for them to use with a PC at home as I expect touch to replace the mouse as the interface that their generation uses to access computing devices.

Far from nosing ahead of Microsoft, I believe Apple has missed the point with its new mouse (please excuse the, entirely accidental, pun). Just as in the years when they insisted that mice only needed a single button (indeed, one of the problems that made the Mighty Mouse so unreliable was that it offered all the functionality of a multi-button mouse with several contact switches under a single button shell in order to maintain the appearance of a single-button mouse), now they are implementing touch on trackpads and mice, rather than on screen. Sure, fingerprints on glass don’t look good but that hasn’t held back the iPhone – and nor would it the iMac or MacBook if they implemented multitouch on screen. For now, at least, Apple is holding off on touchscreen displays, whilst mainstream PC manufacturers such as Dell are embracing the potential for multitouch applications that the latest version of Windows offers. As for the criticism that multitouch monitors are spendy and Apple’s mouse is not, the monitors will come down in price pretty quickly and, based on my experience with Apple’s previous mouse, I won’t be rushing out to spend £55 on the latest model.

As it happens, I bought a mouse to match my white MacBook a couple of weeks ago. Ironically, it’s from Microsoft – the Arc mouse – and it manages to look good, feel good, and fold up for transportation with its (tiny) transceiver neatly attached (with a magnet) to the underside. It seems that Jonathan Ive is not the only person who can design functional and stylish computer hardware (most of the time).

Adventures with Intel Virtualization Technology (VT)


A couple of weeks ago, David Saxon and I ran a Windows 7 Skills Update workshop for some of our colleagues, based on a course obtained from the Microsoft Partner Training and Readiness Resource Center.  My plan was to use David’s excellent training skills to deliver the course (which I recorded), before he left the organisation to take up a new challenge.  Ironically, working for an IT company means that it’s not always easy to get hold of kit for labs and David called in a number of favours in order to get hold of 8 brand new PCs and monitors for us to run the labs.  Each machine was supplied with a quad core CPU and 8GB of RAM but, when we tried to enable the Hyper-V role in Windows Server 2008 R2, it failed because these computers didn’t support Intel’s Virtualization Technology (VT).

“No VT?”, I said. “But these are Intel Core2Quad processors… ah…” – I remembered seeing something about how some Core2Quads don’t provide Intel VT support, even though the Core2Duos do. These were the Q8300 2.5GHz chips and, according to an Intel document, the specification was changed in June to correct this and enable VT.

I should have known better – after all, I’m an MVP in Virtual Machine technology – but I put my hands up, I didn’t check the specifications of the machines that David had ordered (and anyway, I would have expected modern CPUs to include VT).  Mea Culpa.

As the PCs had been manufactured in August, I thought there was a chance that they used the new CPUs but did not have BIOS support for VT.  If that was the case, it may have been possible to enable it (more on that in a moment) but running both CPU-Z and Securable confirmed that these processors definitely didn’t support VT.

In this case, it really was a case of the CPU not providing the necessary features but there are also documented cases of PCs with VT not allowing it to be enabled in the BIOS.  Typically the OEM (most notably Sony) claims that they are consumer models and that VT is an enterprise feature but with Windows 7’s XP Mode relying on Virtual PC 7, which has a dependency on Intel VT or AMD-v, that argument no longer holds water (XP Mode is definitely a consumer feature – as it’s certainly not suitable for enterprise deployment, regardless of Microsoft’s Windows 7 marketing message around application compatibility).

However, with a little bit of perseverance, it may be possible to force VT support on PCs where the functionality is there but not exposed in the BIOS.  Another friend and colleague, Garry Martin, alerted me to a forum post he found where a utility was posted to enable VT on certain Fujitsu machines that have been restricted in this way.  I should say that if you try this, then you do so at your own risk and I will not accept any responsibility for the consequences.  Indeed, I decided not to try it on my problem machines because they were a different model and also, I didn’t fancy explaining to our Equipment Management team how the brand new PCs that we’d borrowed for a couple of days had been “bricked”.  In fact, I’d think it highly unlikely that this tool works on anything other than the model described in the forum post (and almost certainly not with a different OEM’s equipment or with a different BIOS).

Incidentally, Ed Bott has researched which Intel desktop and mobile CPUs support VT and which do not.  As far as I know, all recent server CPUs (i.e. Xeon processors) support VT.

Windows 8 predictions


Just in case you were wondering if the Windows client has a future after Windows 7 (it does), several Internet news sites are reporting that a Microsoft employee accidentally leaked details of his work on future Windows versions on his LinkedIn profile.  According to Gizmodo, Microsoft Research employee Robert Morgan carelessly left the following details in full public view:

“Working in high security department for research and development involving strategic planning for medium and longterm projects. Research & Development projects including 128bit architecture compatibility with the Windows 8 kernel and Windows 9 project plan. Forming relationships with major partners: Intel, AMD, HP, and IBM.

Robert Morgan is working to get IA-128 working backwards with full binary compatibility on the existing IA-64 instructions in the hardware simulation to work for Windows 8 and definitely Windows 9.”

It’s no secret that there will be a Windows 8 – Microsoft has already publicly committed to a new release in 3 years’ time; however anyone working in a “high security” role would be unwise to leave details of their work on a social networking site!

For what it’s worth (I know nothing at this time… but when I do, I’m sure it will be under NDA so I should write it down now!), I would expect 64-bit computing to be mainstream on the client in the Windows 8 timeframe (and if you’re not considering it for Windows 7, then you should), and would only expect 128-bit to be relevant for high-end server versions (note that the quote above refers to IA-64 and IA-128 – so that’s Itanium rather than some new “x128” desktop hardware).  I’d also expect tighter integration with the cloud, and further developments in the area of boot from VHD, to further decouple the operating system from the hardware.

Of course, all of this is pure speculation on my part.

Some say he has silicon innards and eats data for breakfast…


Fans of the BBC’s Top Gear programme may be interested in a novelty “USB memory Stig” that goes on sale today… I can’t take credit for the pun or the title of this post (both were ripped off from a comment on the product page at play.com) but if you know anyone who fancies an 8GB USB flash drive shaped like a racing driver then this might be worth a look.

(And yes, I know that Top Gear is no longer really a motoring programme and more about a bunch of middle-aged guys messing around in cars and generally failing to grow up… but that’s exactly the point! It seems to me that Messrs. Clarkson, May and Hammond have, quite possibly, the best jobs in the world.)

Drive speeds for ATA, USB Flash, SDHC and CF


[In recent weeks, there have been a number of posts on this blog looking at the Hyper-V developer workstation proof of concept (booting from a .VHD on a flash drive) that I knocked up for my colleague Garry Martin. This guest post is slightly different in that Garry did the legwork and provided me with some notes to publish in his name. So, with a little editorial input from yours truly, here are Garry’s notes on drive speeds for ATA, USB Flash, SDHC and CF.]

Card speed is usually specified in “x” ratings, which express the data rate as a multiple of that of the first CD-ROM drives, which ran at 150kB/s. Thus a 133x SDHC card runs at 133 * 150kB/s = ~20MB/s.

Most premium SDHC cards on the market today run at 200x (30MB/s) although a few 233x (~35MB/s) cards have started to appear.

In contrast, most premium CF cards on the market today run at 300x, or ~45MB/s, roughly equivalent to a modern 80GB laptop hard drive.

The faster USB flash drives currently run at somewhere between 200x and 233x (30MB/s and 35MB/s). Examples are the Corsair Flash Voyager GT 4GB, the OCZ ATV Turbo 4GB and the Lexar JumpDrive Lightning 4GB.
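The “x” rating arithmetic above is simple enough to sketch as a one-line conversion – multiply by the 150kB/s base CD-ROM rate and divide by 1,000 for an approximate MB/s figure:

```python
def x_rating_to_mb_s(x_rating: int) -> float:
    """Convert a flash media "x" rating to an approximate data rate in MB/s.

    1x is the original CD-ROM data rate of 150 kB/s.
    """
    return x_rating * 150 / 1000

# The ratings quoted above: 133x, 200x, 233x and 300x
for x in (133, 200, 233, 300):
    print(f"{x}x is roughly {x_rating_to_mb_s(x)} MB/s")
```

This reproduces the figures in the text: 200x works out at 30MB/s and 300x at 45MB/s.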

Looking at pricing for the various media types:

Approximate prices to purchase (UK prices, August 2009):

  • SanDisk Extreme III SDHC 4GB (200x): £20
  • SanDisk Extreme III SDHC 8GB (200x): £50
  • SanDisk Extreme III 4GB CompactFlash (200x): £30
  • Corsair Flash Voyager GT 4GB (233x): £35
  • OCZ ATV Turbo 4GB (233x): £35
  • Lexar JumpDrive Lightning 4GB (200x): £35

[I’m still playing around with flash for my USB boot scenario (especially as an SDHC card will sit nicely in my notebook PC’s card reader slot) but, ironically, for a project that started out looking at booting from a USB flash drive, we will probably settle on the use of external hard disks. This isn’t for reasons of performance but because the internal disks that would have stored the VMs are encrypted – whereas with a USB-bootable hard disk we can store the VHDs for our VM workloads and the parent partition’s bootable VHD. (BitLocker and hibernation are two Windows features that boot from VHD does not support.)]

Hackintosh netbook revisited


A few months ago, I wrote about the installation of Mac OS X on my Lenovo IdeaPad S10e netbook. Whilst I was pleased to have a working installation of OS X, there were still a few things that didn’t quite work as I’d have liked. This post details a few more tweaks I’ve made to the Hackintosh.

My S10e is a Hackintosh, rather than a Macintosh, so I replaced the standard Mac OS X Finder icon with the Hackintosh Finder Icon by ~3nc using LiteIcon.

I thought that the fans weren’t running as often as they had been under Windows… in fact I’m not even sure they were running at all. Furthermore, iStatMenus would only tell me the hard disk temperature so I wasn’t sure how warm the CPU was running, or how fast the fans were turning. Thankfully, before I fried my netbook, a comment on this blog pointed me back to The Kitch and ultimately to a post on the Lenovo IdeaPad S Series Forums which linked to an updated version of AppleACPIPlatform.kext, which I then installed using Kext Helper. After a reboot, my fans have been running to keep the netbook cool(er), although it’s still pretty hot and I seem to have lost Bluetooth.

I had a play with a few options to scale the screen resolution; however the results were not really fantastic. I did eventually settle on using defaults write NSGlobalDomain AppleDisplayScaleFactor 0.96 to make the screen appear to be 600 pixels deep but some of the icons (e.g. the battery on the menu bar) were screwed up.

I also have a UK keyboard, so I followed Liquid State’s advice, using Ukelele‘s LogitechU.K.Intl.keylayout (copied to /Library/Keyboard Layouts and selected in the International system preferences) and then adjusting the modifier keys as described by Phil Gyford (alternatively, I could have swapped the Windows key and the alt key to keep them the same way around as on a Mac keyboard). Incidentally, Apple keyboards still have the ” and @ reversed (even with a UK layout) but at least with this configuration the labels on the keys matched the resulting output.

The biggest letdown was Ethernet connectivity. There was a project working on porting the Broadcom BCM57xx and 59xx Linux drivers to OS X but nothing is happening fast and it really seems to be one guy working with limited spare time and limited collaboration. Wireless is fine but wired Ethernet is more reliable (and often the only option in a hotel room) so this was probably the final nail in my Hackintosh’s coffin.

Now the S10 has been replaced by the S10-2 and Gizmodo reports that it’s not really suitable for hackintosh conversion. My Hackintosh was a fun experiment but ultimately I’m not finding it as useful as I would if it was running Windows. It’s not that there is anything wrong with Mac OS X but I use Macs for my digital media work and a netbook is not really the right computing platform for that. In addition, I’m missing out on things like reliable Bluetooth, sleep, and Ethernet connectivity – all of which I could get in a Mac… if I was prepared to pay the money. Let’s see if the Apple iPod tablet really does make it to market this winter.

In a few hours, I’ll take a final disk image of the Hackintosh for posterity and rebuild it to run the final release of Windows 7 (thanks to Microsoft for my complimentary copy) – which is, after all, what I originally bought it for!

Looking for a 64-bit notebook to run a type 1 hypervisor (Hyper-V… or maybe XenClient?)


Earlier today, someone contacted me via this website and asked for advice on specifying a 64-bit laptop to run Hyper-V. He was confused about which laptops would actually work and I gave the following advice (unfortunately, my e-mail was rejected by his mail server)…

“The main thing to watch out for is the processor specification. If you get the model number (e.g. T7500) and look this up on the Intel website you can see if it has a feature called EM64T or Intel 64 – that is Intel’s implementation of AMD’s x64 technology – and most PCs have it today. Other things you will need for Hyper-V are hardware DEP (Intel XD or AMD NX) and hardware assisted virtualisation (Intel-VT or AMD-V). This last one might catch you out – some quad core chips don’t have the necessary functionality but most dual core chips do (and I’ve heard some reports from people where the chip supports it but there is no option to enable it in the BIOS).

Also, if you’re running 64-bit, driver support can be a pain. Stick with the major manufacturers (Lenovo, Dell, HP) and you should be OK. I was able to get all the drivers I needed for my Fujitsu notebook too.”
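On Linux (e.g. booted from a live CD), the same processor capabilities described in the advice above can be read from /proc/cpuinfo: the lm flag marks 64-bit long mode (Intel 64/AMD64), nx marks hardware DEP, and vmx or svm mark Intel VT or AMD-V respectively. Here’s a minimal sketch – the sample flags string is illustrative, not taken from any particular CPU:

```python
def check_virtualisation_features(cpuinfo_flags: str) -> dict:
    """Report the CPU features Hyper-V needs, given a /proc/cpuinfo flags line."""
    flags = set(cpuinfo_flags.split())
    return {
        "64-bit (lm)": "lm" in flags,
        "hardware DEP (nx)": "nx" in flags,
        "Intel VT (vmx)": "vmx" in flags,
        "AMD-V (svm)": "svm" in flags,
    }

# Illustrative flags line (not from a real CPU):
sample = "fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov nx lm vmx"
features = check_virtualisation_features(sample)

# Hyper-V needs 64-bit, hardware DEP, and one of the two virtualisation extensions
hyper_v_ok = (features["64-bit (lm)"] and features["hardware DEP (nx)"]
              and (features["Intel VT (vmx)"] or features["AMD-V (svm)"]))
```

On a real machine you’d read the flags line from open("/proc/cpuinfo") – and bear in mind that vmx/svm can be present in the silicon yet still disabled in the BIOS, which is exactly the situation described in this post.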

If you want to run Hyper-V on a notebook, it’s worth considering that notebook PCs typically have pretty slow hard drives and that can hit performance hard (notebook PCs are not designed to run as servers). Despite feedback indicating that Virtual PC does not provide all the answers, Microsoft doesn’t yet have a decent client-side virtualisation solution for developers, tech enthusiasts and other power users but Citrix have announced something that does look interesting – the XenClient (part of what they call Project Independence), described as:

“[…] a strategic product initiative with partners like Intel, focused on local virtual desktops. We are working together to deliver on our combined vision for the future of desktop computing. This new virtualization solution will extend the benefits of hosted desktop virtualization to millions of mobile workers with the introduction of a new client-side bare metal hypervisor that runs directly on each end user’s laptop or PC.”

You can read more at virtualization.info – and it’s probably worth watching the last 15 minutes from the Synergy day 2 keynote (thanks to Garry Martin for alerting me to this).

Layered on top of XenClient are the management tools that allow organisations to ensure the corporate desktop remains secure whilst a personal desktop is open. With that in place, the scenario where we no longer have a corporate notebook PC (and are instead given an allowance to procure and provide our own IT for work and personal use) suddenly seems a lot more credible. I’m certainly hoping to take a closer look at XenClient, once I can work out how to get hold of it.

Windows 7 edges closer to release


I keep saying that I don’t really do news here… but I’m excited about Windows 7. I heard a whisper that Microsoft was going to announce ship dates at a conference in Taiwan tomorrow (thanks, Dave). I also heard via the OEM channel that there would be a programme for Windows Vista-to-7 upgrades on new PCs purchased this summer… which was supposed to be top secret, but that’s been announced too.

According to the Windows Team Blog (breaking Microsoft’s own embargo on this news…), Windows 7 and Windows Server 2008 R2 will RTM in July and Windows 7 will be in stores from 22 October. The blog post also confirmed that plans are in the works for a Windows 7 upgrade option program (with more details expected soon).

Other Windows 7 related developments in recent days include that:

[Image: “I’m a PC – and I’m running Windows 7”]

Notice that I said “when my netbook is rebuilt” – I may be playing with OS X on my S10e but that’s just a geek project and I expect it to be a Windows 7 machine again soon. Perhaps more significantly, my everyday notebook PC (upon which I depend to do my work) is already running the RC build of Windows 7 Ultimate Edition (64-bit). I may be reticent to say “I’m a PC” (I also use Macs and Linux at home) but the accompanying graphic has appeared on a few e-mails in my Inbox this week and, at least for work, it’s entirely appropriate for me.

Windows 7 Starter Edition: let’s put it into perspective


There seem to be a number of sites linking to a prominent “news” site that claims Windows 7 will be “crippled” on netbooks but… WTF? Are they serious, or just posting link bait?

Back in February, Microsoft announced the various editions that Windows 7 will be available in, including Starter Edition, which will only be offered pre-installed by OEMs and is recommended for price-sensitive customers with small notebook PCs.

Basically, that sounds like a low-cost version for netbooks – and the key features were listed as:

  • Broad application and device compatibility with up to 3 concurrent applications.
  • Safe, reliable, and supported.
  • Ability to join a Home Group.
  • Improved taskbar and JumpLists.

Now someone has stirred things up and headline-grabbing tech-“journalists” (I use the term lightly… these are not the Mary-Jo Foleys, Ed Botts, or Paul Thurrotts who actually look at the technology when researching stories, but consumer-focused writers with a few press releases and 500 words to churn out for an editor who wants nothing more than a good headline) are saying how this will kill Windows 7 sales and open the netbook market to Linux. Yawn. Have I suddenly fallen foul of a cross-site scripting exploit and ended up reading Slashdot, or The Register? Nope. It seems I am still reading Computerworld – a site whose comment filter seems to think that words like “Ed Bott” or “ZDNet” turn my comment into spam!

It’s the three application limit that seems to have people up in arms but, according to Paul Thurrott in episode 103 of the Windows Weekly podcast and Ed Bott’s recent post on living with the limits of Windows 7 Starter Edition, the three application limit is not triggered by things like Explorer windows, Control Panel applets, system utilities or gadgets – this is three applications, not three windows!

And, as I wrote when I bought one a few months back, netbooks are not for content creation but for ultra-mobile content consumption. You’re not going to be doing much on a 10″ screen with a tiny keyboard! Not unless you want to end up with a bad repetitive strain injury.

Mary-Jo Foley reminds us that Home Premium is the default consumer version of Windows 7 – not Starter Edition. Who says that netbook OEMs will not provide Home Premium for those who want it?

Meanwhile, Ed Bott made a very good point when he wrote “Is this a netbook or a notebook? If the answer is netbook, you might be pleasantly surprised at what this low-powered OS can actually accomplish” but he also notes that, if he tried to use it as a conventional notebook, he “would probably be incredibly frustrated with the limitations of Starter Edition.” And Laptop magazine wisely commented that any comment has limited value until we know the price difference between a netbook with Windows 7 Starter Edition and the same netbook with Windows 7 Home Premium, a view which Mary-Jo Foley also puts forward in her post.

To me, it’s simple:

If I was a betting man, I’d wager that most netbook users fall into the latter category.