Microsoft infrastructure optimisation

This content is 17 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

I don't normally write about my work on this blog (at least not directly) but this post probably needs a little disclaimer: a few months ago I started a new assignment in my employer's Microsoft Practice and, whilst I'm getting involved in all sorts of exciting stuff, a large part of this work will involve consultancy engagements to help customers understand the opportunities for optimising their infrastructure. Regardless of my own involvement in this field, I've intended to write a little about Microsoft's infrastructure optimisation (IO) model since I saw Garry Corcoran of Microsoft UK present at the Microsoft Management Summit highlights event back in May… this is a little taster of what IO (specifically Core IO) is about.

Based on the Gartner infrastructure maturity model, the Microsoft infrastructure optimisation model is broken into three areas, around which IT and security processes are wrapped:

  • Core infrastructure optimisation.
  • Business productivity infrastructure optimisation.
  • Application platform infrastructure optimisation.

Organisations are assessed on a number of capabilities and judged to be at one of four levels (compared with seven in the Gartner model):

  • Basic (“we fight fires” – IT is a cost centre) – an uncoordinated, manual infrastructure, knowledge not captured.
  • Standardised (“we’re gaining control” – IT becomes a more efficient cost centre) – a managed IT infrastructure with limited automation and knowledge capture.
  • Rationalised (IT is a business enabler) – managed and consolidated IT infrastructure with extensive automation, knowledge captured for re-use.
  • Dynamic (IT is a strategic asset) – fully automated management, dynamic resource usage, business-linked SLAs, knowledge capture and re-use automated.

It's important to note that an organisation can be at different levels for each capability and that the capability levels should not be viewed as a scorecard – after all, for many organisations, IT supports the business (not the other way around) and basic or standardised may well be perfectly adequate. The overall intention is to move from IT as a cost centre to a point where the business value exceeds the cost of investment. For example, Microsoft's research (carried out by IDC) indicated that moving from basic to standardised could reduce the cost of annual IT labour per PC from $1320 to $580, and rationalisation could yield further savings, down to $230 per PC per annum. Of course, this needs to be balanced against the investment cost (however that is measured). Indeed, many organisations may not want a dynamic IT infrastructure as this will actually increase their IT spending; the intention, however, is that the business value returned will far exceed the additional IT costs – the real aim is to improve IT efficiency, increase agility and shift the investment mix.
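As a back-of-an-envelope illustration of those IDC figures (my own hypothetical example – the 1,000-PC estate is an assumption, not part of the research):

rem Hypothetical worked example using the per-PC annual IT labour figures above
set /a pcs=1000
set /a basicToStd=(1320-580)*pcs
set /a stdToRat=(580-230)*pcs
echo Basic to standardised saves $%basicToStd% per annum (740000 here)
echo Standardised to rationalised saves a further $%stdToRat% (350000 here)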

Microsoft and its partners make use of modelling tools from Alinean to deliver infrastructure optimisation services (and new models are being released all the time). Even though this is clearly a Microsoft initiative, Alinean was formed by ex-Gartner staff and the research behind Core IO was conducted by IDC and Wipro. Each partner has its own service methodology wrapped around the toolset but the basic principles are similar. An assessment is made of where an organisation currently is and where it wants to be. Capability gaps are then assessed and further modelling can help to identify the areas where investment has the potential to yield the greatest business benefit, and what will be required to deliver such results.

It’s important to note that this is not just a technology exercise – there is a balance to be struck between people, processes and technology. Microsoft has published a series of implementer resource guides to help organisations to make the move from basic to standardised, standardised to rationalised and from rationalised to dynamic.

Links

Core infrastructure self-assessment.
Microsoft infrastructure optimisation journey.

Microsoft’s support policy for software running in a non-Microsoft VM

This content is 18 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

I'm troubleshooting some problems with my Exchange server at the moment and the Exchange Best Practices Analyzer (ExBPA) led me to a knowledge base article about running Exchange Server in a virtualised environment. Whilst reading that, I came across Microsoft knowledge base article 897615, which discusses the support policy for Microsoft software running in non-Microsoft hardware virtualisation software.

I'll paraphrase it as "If you have Premier support and you use our virtualisation software, we'll try and work out what the issue is (we use Virtual Server 2005 R2 to do that anyway). If you don't have Premier support, then you should, and you need to prove that it's nothing to do with virtualisation (i.e. can you replicate the issue on physical hardware?). If you have a Premier agreement but you use another vendor's virtualisation software then we'll try our best, but you'll probably have to prove that the problem is not caused by the virtualisation software". The crux of this is the statement that:

“Microsoft does not test or support Microsoft software running in conjunction with non-Microsoft hardware virtualization software.”

This might be worth considering whilst selecting which (if any) virtualisation platform is right for an organisation.

WSUS 3.0 delivers huge improvements for the deployment of Microsoft updates

This content is 18 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

I've been an advocate of Microsoft SUS/WSUS since the v1.0 release. Sure, there are better enterprise software deployment products out there (Microsoft even has one – Systems Management Server) but as a low-cost (free) patch management solution for Windows, it's hard to beat Software Update Services – or Windows Server Update Services as it became with version 2.0 (note the subtle name change), when it also started to update more than just Windows: WSUS 2.0 can act as a local cache for all updates that are available through the Microsoft Update servers. Except that now it has been beaten – by WSUS 3.0.

WSUS 3.0 was launched a couple of months ago and I finally installed it this afternoon. Not only does it include some great new features (like e-mail notification, improved reporting and computer management) but it finally gets an MMC administration interface (a huge improvement on the previous web administration interface). There are database changes too – WSUS no longer supports SQL Server 2000/MSDE (after all, those products are shortly to be retired), although it will upgrade an existing database.

The only downside that I can see is that the product still relies on clients connecting to the server and pulling updates (there is no option to push updates to clients – at least not as far as I can see). That's fine, but it does introduce some latency into the process (i.e. if there is an urgent patch to deploy then WSUS is probably not the right tool to use); however, for the basic operational task of keeping a Windows infrastructure patched (for Microsoft products) and reporting on the current state, WSUS is definitely worth considering.
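That said, detection latency can be trimmed from the client end – the Automatic Updates client can be told to check in with its WSUS server straight away. A minimal sketch (run on each client, perhaps from a startup script; wuauclt is the standard Automatic Updates client executable):

rem Ask the Automatic Updates client to run detection against its WSUS server now
wuauclt /detectnow
rem Report current status back to the WSUS server (WSUS clients only)
wuauclt /reportnow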

Further Information

WSUS 3.0 distributed network improvements (white paper).
WSUS 3.0 Usability improvements (white paper).

The Microsoft-Novell alliance – good, bad or ugly?

This content is 18 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

A few weeks back, I attended a Novell webcast about last year’s Novell-Microsoft collaboration agreement. Although that particular event was for partners, I’ve since found that the same presentation is available to a wider audience so I’m not breaching any NDAs by writing a bit more here about what this is all about.

We live in a heterogeneous world; most of the world's data centres run a combination of mainframe operating systems, Unix, Windows and Linux. As commodity server hardware takes hold, many organisations previously running Unix-derived operating systems are starting to look at Linux (what Novell don't say is that many won't consider running Linux because of concerns about the supportability of open source software). Clearly a move from Unix to Linux is easier than a move to Windows, so (according to Novell) Microsoft has taken the pragmatic approach and partnered with Novell, who claim that SUSE Enterprise Linux is more prevalent in data centres than Red Hat – the number one Linux distribution (I'm sure that Microsoft would argue that Windows Server 2003 and 2008 include better integration with, and application support for, Unix-like operating systems).

The Novell-Microsoft collaboration agreement focuses on three technology areas:

  • Virtualisation – virtualisation is a hot topic and the various competing technologies each take a different approach. Novell and Microsoft consider their solutions (with interoperability options for Xen and Windows Server Virtualization) to give the best in performance, support, interoperability, cost and management (I'd say that's not yet true, but it may become closer to the truth with Windows Server Virtualization). Novell are quick to point out that Red Hat now include Xen (since Red Hat Enterprise Linux 5) but only support their own operating system in a virtual environment, whereas Novell will support Red Hat, SUSE and Windows (NT/2000/2003) guests.
  • Heterogeneous systems management – today's server management products are a minefield of standards-based and proprietary software. Under the Novell-Microsoft collaboration deal, the two companies will co-sponsor and contribute to a number of open source WS-Management products. They will also improve federation between Microsoft Active Directory and Novell eDirectory using WS-Federation and WS-Security.
  • Document format compatibility – Novell describes Microsoft as having a "healthy market share" (I'd call that an understatement – others might consider Microsoft's dominance of the office productivity application market to be unhealthy). Novell considers the open document format (ODF) to be growing in support (if not from Microsoft) and projects that it will soon become the standard for governments. Under the agreement, Microsoft and Novell will co-operate to make it easier for customers to use either or both Open XML and ODF formats.

Under the terms of the arrangement, Microsoft has purchased vouchers that may be exchanged for copies of SUSE Enterprise Linux and will issue them to customers who are looking at Linux, in a cross-licensing arrangement that indemnifies SUSE Enterprise Linux users against patent infringement claims – as discussed in episode 93 of the Security Now podcast (transcript). In return, Novell hopes to become the enterprise Linux of choice and has issued a similar covenant to indemnify Microsoft customers against claims on its patents.

Remember that this information has come from Novell – not Microsoft – and there is a lot of fear, uncertainty and doubt (FUD) circulating at present about Microsoft's true motives for a Microsoft-Linux alliance (including rumours of open source software's widespread infringement of Microsoft's software patents).

As an infrastructure architect working for a systems integrator, my personal view is that anything that leads to interoperability improvements is a bonus. I'm not sure that's what we have here – the Microsoft-Novell relationship seems (to me) to be more about marketing than anything substantive (although they have announced a joint technical roadmap) – but we'll see how this works out. It has certainly got the Linux movement up in arms as Microsoft has announced further partnerships with some less significant distributions (including Xandros and Linspire) and consumer electronics giants who use Linux in their products (notably Samsung and LG).

It will be interesting to see how Ubuntu reacts over time. Ubuntu founder Mark Shuttleworth's latest reaction is neither hostile nor approving (although he did earlier invite OpenSUSE developers to defect to Ubuntu) and he can now be quoted as saying:

“We have declined to discuss any agreement with Microsoft under the threat of unspecified patent infringements.”

[Mark Shuttleworth, founder of the Ubuntu project]

I’m certainly not expecting a Microsoft deal from the number one Linux distribution:

“We believe…

It was inevitable. The best technology has been acknowledged.

The relentless march of open source is shaking up the industry by freeing customers from proprietary lock-in and lack of choice.

[…]

We will not compromise.”

[Red Hat statement on the Microsoft Novell announcement]

There’s more from Red Hat’s Mark Webbink and ars technica has a good review of why he is ever-so-slightly misguided in his assertion that:

“These guys made noise. Larry Ellison had the effect he wanted to have, and our stock price went down. But let’s see where we all are a year from now. We will still be standing. We still believe that we will be the dominant player in the Linux market because, by that time, there won’t be any other Linux players. We will have succeeded once again.”

[Enterprise Linux News – Red Hat: We will be here in one year, Novell will not.]

Whilst I've not spoken to anybody at Microsoft on this particular topic, it does strike me that Microsoft employees are, by and large, either extremely defensive or a touch arrogant when open source software is mentioned (to be fair, so are representatives of many companies if you ask them to talk about the competition). Maybe Microsoft can help make a better Linux (as the Linspire agreement suggests) but will they? For one example, they rejected my feature request for Linux client support in Windows Home Server; and one Microsoft employee had a good point when we were discussing my desire to see (ideally no DRM at all but, more realistically) a single cross-platform, standards-based DRM solution – "would [Linux users] accept a solution from Microsoft?" (to which I would append ", Apple or any other closed source vendor?") – probably not.

Further information

Microsoft Interoperability.
Novell/Microsoft more interop.
Novell and Microsoft collaborate – customers win.

Is a picture worth a thousand words?

ars technica has a visual timeline of the Novell-Microsoft controversy, including this gem of an illustration of Novell's apparent business strategy.

Steve Jobs and Bill Gates at D5

This content is 18 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

I know I’m a bit late posting this, but the Steve Jobs and Bill Gates interview at the D5 conference is available for free download at the iTunes Store (audio or video). Love them or hate them, these two pioneers of the personal computing world have far more in common than the media (and the fanboys) would generally let us believe and I personally found it very interesting.

Getting tactile

This content is 18 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

2007 seems to be the year of touch computing. It started at Macworld with the Apple iPhone announcement. Then HTC introduced a touch phone that runs Windows (before the iPhone made it to market). Now Microsoft has come up with Surface – a table that runs Windows, using a touch-screen interface to very good effect.

I’m not really sure that this is a product that’s going anywhere fast (and I’ll spare you the Bill Gates demo – the Associated Press one is less likely to send you to sleep) but Microsoft is constantly being criticised for a lack of innovation and as a concept, Surface is certainly interesting. Personally, I can’t wait. Not to have an expensive coffee table upon which to bore people with digital photos (I can already do that with the TV!) but because I can feel a return to the “Space Invaders” tabletop video games of my youth coming on!

Migrating WSUS to a new server without downloading all the updates

This content is 18 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

I've spent the last day or so decommissioning my old domain controller, which also doubled up as a DNS, WINS, DHCP, print, RIS, anti-virus and WSUS server (okay… so a bit more than doubled up then!). Migrating Active Directory/DNS/WINS services was straightforward – it just involved setting up the new server, replicating the data, updating client settings (via DHCP) then removing the old services. DHCP was similarly straightforward (I've blogged before about migrating DHCP databases between servers – see the sketch below) and RIS just needed to be installed on the new server, the images copied across, and the remote installation services setup wizard run. I recreated my single print queue manually but I could just as well have used the Microsoft Windows Server 2003 Print Migrator. That just left my anti-virus management console (reinstall the console and reconfigure the clients) and WSUS.
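As an aside, the DHCP part of a move like this can be scripted with the standard netsh dhcp context (a sketch – the file path is just an example):

rem On the old server: export all scopes, settings and leases
netsh dhcp server export C:\dhcpdb.dat all
rem Copy C:\dhcpdb.dat across, then on the new server:
netsh dhcp server import C:\dhcpdb.dat all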

I could just have installed WSUS, resynchronised with Microsoft Update and approved the necessary updates; however that would have involved downloading more than 10GB of updates (which could have taken my bandwidth usage for the month to a level that would result in my Internet connection being throttled under my ISP’s sustainable usage policy).

One potential WSUS migration option would have been to backup and restore the WSUS configuration, but I wasn't convinced about how that would work in a migration scenario involving a change of server name. Then I found a blog post from Nathan Winters about migrating WSUS between servers which helped me to import the content without going out to the Internet and downloading it again. Nathan suggests that the approvals database gets imported too, but that's not the case – the wsusutil import command only imports the update metadata (not the files, approvals or server settings). Similarly, wsusutil migratesus migrates approvals from a SUS server (not WSUS) and wsusutil movecontent is for moving the content within the local file system. More details on managing WSUS from the command line can be found in the Microsoft Windows Server TechCenter.
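For reference, here is how those wsusutil operations look on a WSUS 2.0 server (a sketch – the .cab and log file names are just examples):

rem Export update metadata only (run on the old server)
"%ProgramFiles%\Update Services\Tools\wsusutil.exe" export wsus.cab export.log
rem Import update metadata only (run on the new server, after copying wsus.cab across)
"%ProgramFiles%\Update Services\Tools\wsusutil.exe" import wsus.cab import.log
rem Move locally-stored update files to a new path on the same server (not between servers)
"%ProgramFiles%\Update Services\Tools\wsusutil.exe" movecontent D:\WsusContent movecontent.log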

By chance, I'd installed my new WSUS server as a replica of the original one, so I could synchronise with the old server as my upstream source, leaving the new server with the content (from a manual file copy followed by a metadata import) and the approvals information (from the synchronisation with the old server). All that remained was to finalise the server settings (synchronisation options etc.) and update group policy so that my clients went to the new server.

I hit a problem when I found that WSUS 2.0 doesn't allow replica servers to be converted to standalone mode (that capability is expected when WSUS 3.0 is released later this year), effectively preventing me from repointing WSUS to download updates from Microsoft Update. Luckily, Mohammed Athif Khaleel's PatchAholic blog features a post on changing the mode of a WSUS server and a follow-up comment from SpJam includes a script to switch a server from replica to standalone mode (modified here to reflect subsequent comments):

rem Restore values after exec spEnableReplica stored procedure
"%ProgramFiles%\Update Services\tools\osql\osql.exe" -S %COMPUTERNAME%\WSUS -E -b -n -Q "USE SUSDB UPDATE dbo.tbConfigurationA SET SyncToMU = '1' UPDATE dbo.tbConfigurationB SET AutoRefreshDeployments = '1' UPDATE dbo.tbConfigurationC SET ReplicaMode = '0' UPDATE dbo.tbConfigurationC SET AutoDeployMandatory = '1' UPDATE dbo.tbAutoDeploymentRule SET Enabled = '0'"

rem Add removed values in tables
"%ProgramFiles%\Update Services\tools\osql\osql.exe" -S %COMPUTERNAME%\WSUS -E -b -n -Q "USE SUSDB Insert into dbo.tbTargetGroupInAutoDeploymentRule(AutoDeploymentRuleID, TargetGroupID) values (1, 'A0A08746-4DBE-4a37-9ADF-9E7652C0B421')"
"%ProgramFiles%\Update Services\tools\osql\osql.exe" -S %COMPUTERNAME%\WSUS -E -b -n -Q "USE SUSDB Insert into dbo.tbTargetGroupInAutoDeploymentRule(AutoDeploymentRuleID, TargetGroupID) values (2, 'A0A08746-4DBE-4a37-9ADF-9E7652C0B421')"
"%ProgramFiles%\Update Services\tools\osql\osql.exe" -S %COMPUTERNAME%\WSUS -E -b -n -Q "USE SUSDB Insert into dbo.tbUpdateClassificationInAutoDeploymentRule(AutoDeploymentRuleID, UpdateClassificationID) values (1, 1)"
"%ProgramFiles%\Update Services\tools\osql\osql.exe" -S %COMPUTERNAME%\WSUS -E -b -n -Q "USE SUSDB Insert into dbo.tbUpdateClassificationInAutoDeploymentRule(AutoDeploymentRuleID, UpdateClassificationID) values (1, 5)"
"%ProgramFiles%\Update Services\tools\osql\osql.exe" -S %COMPUTERNAME%\WSUS -E -b -n -Q "USE SUSDB Insert into dbo.tbUpdateClassificationInAutoDeploymentRule(AutoDeploymentRuleID, UpdateClassificationID) values (2, 1)"
"%ProgramFiles%\Update Services\tools\osql\osql.exe" -S %COMPUTERNAME%\WSUS -E -b -n -Q "USE SUSDB Insert into dbo.tbUpdateClassificationInAutoDeploymentRule(AutoDeploymentRuleID, UpdateClassificationID) values (2, 5)"

It looked as if the script worked as advertised (except that automatic approval options were still not available) until I started to encounter the following error message when running reports or attempting to view update information:

Windows Server Update Services error

Error connecting to the Windows Server Update Services database
There was an error connecting to the Windows Server Update Services database. Either the database is not available or you do not have the correct privileges to access the database.

If you believe you have received this message in error, please check with your system administrator.

Click here to reload the site: Windows Server Update Services

Thinking that I had corrupted the database and that I might need to go back and start the WSUS migration from scratch, I decided to restart the server “just in case”. After the restart, everything seemed to be working (including the previously-missing automatic approval options). I’ve since approved some more updates and run various reports and (so far) there have been no problems administering WSUS.
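With hindsight, a full reboot may have been more than was strictly needed – restarting IIS and the WSUS database instance might have cleared the cached state just as well (a sketch; MSSQL$WSUS assumes the default WMSDE instance name):

rem Restart IIS, which hosts the WSUS administration site and client web services
iisreset
rem Restart the WMSDE database instance used by WSUS (instance name assumed)
net stop MSSQL$WSUS
net start MSSQL$WSUS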

The final step was to edit the group policy that I use to control automatic update options on my clients – a minor edit to change the server which clients should contact for updates.
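The setting in question is "Specify intranet Microsoft update service location" (under Computer Configuration, Administrative Templates, Windows Components, Windows Update). For a quick test on a single client, the same values can be set directly in the registry (a sketch – http://newserver is a placeholder for the real WSUS server URL):

rem Registry values behind the Windows Update group policy setting
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate" /v WUServer /t REG_SZ /d http://newserver /f
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate" /v WUStatusServer /t REG_SZ /d http://newserver /f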

So, to summarise, my WSUS migration process was:

  1. Install BITS 2.0 (a fully-patched Windows Server 2003 server should already have this).
  2. Install WSUS (in replica mode) and WMSDE.
  3. Export the update metadata on the old server using %programfiles%\Update Services\Tools\wsusutil export filename.cab logfilename.txt.
  4. Copy filename.cab (created above) and the contents of the WsusContent folder to the new server (e.g. using an external disk to avoid network connectivity issues).
  5. Import the update metadata using %programfiles%\Update Services\Tools\wsusutil import filename.cab logfilename.txt (note that this takes a long time – it was just over three hours in my case).
  6. Synchronise WSUS with an upstream server.
  7. Save the script above as filename.cmd and execute it from the command line. The output will detail each command followed by the number of affected rows in the database.
  8. Reboot the server.
  9. Configure server settings (e.g. set Microsoft Update as the update source) and administer WSUS as normal.

I'd be interested to hear if anyone has any variations on this approach – I don't really recommend installing WSUS in replica mode and then hacking the database (and it wouldn't be an option if there were any network segregation in place). Indeed, since completing the exercise I've found a reference to a tool called WSUSMigrate (part of the WSUS API samples and tools) which can be used to migrate the approvals data – that looks like a much better approach.

Keeping up with the news (plus some tips for Windows Vista)

This content is 18 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

Clicking through from one of Victor Laurie’s sites, I found Ed Bott’s 10 expert tips and tweaks for Windows Vista – it looks as though there are some nice tips there.

I regularly read Paul Thurrott's writing (as well as listening to his Windows Weekly podcast with Leo Laporte) and I occasionally check out what Stephen Bink and Ryan Hoffman have to say, but it seems Ed Bott's Microsoft Report is another useful resource for keeping up to date with the latest news from Redmond (the official Microsoft news is at PressPass, but it's all so clinical and corporate).

Of course, Ed Bott writes at ZDNet, which has loads of writers churning out news on Microsoft, Google, Apple and others, but it's just so hard to keep up (and RSS feeds are worsening my information overload instead of easing it!) – I just thought I'd make a note here on the blog in case it turns out to be useful for someone.

Does the world really need another search engine?

This content is 18 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

Two of London's free newspapers for commuters (Metro and The London Paper) are featuring wrap-around ads for Microsoft's Live Search today. The front page is almost entirely blank, save for a search box which asks "Does the world really need another search engine?":

Does the world really need another search engine?

As Google and Yahoo! have once again extended their lead over Microsoft in the search engine rankings and Google has become the most visited website in the UK, I have to wonder if Microsoft should be asking themselves the same question. It's all very well emphasising the extra features that Live Search offers – like controlling the number of results on a single page, hovering over images for more detail, providing bird's eye views to accompany maps and directions (all very well for pilots and birds, but not so useful on the ground) and personalising results – however, of all organisations, Microsoft should be well aware that it's not necessarily the product with the best feature set that gains the most market share. Having said that, Google came from nowhere a few years back – and who uses the pioneering Lycos, Excite and AltaVista search engines today?

Live Search is certainly impressive and Microsoft’s ads state that:

“To us, search is in its infancy. This is just the start.”

Maybe Live Search will push Google into doing some work to integrate their disparate Web 2.0 applications (many of which seem to be in a perpetual beta state); in the meantime, the message seemed to be lost as I observed commuters at Canary Wharf – one of London’s major commercial centres – simply flicking past the four full page ads to get to the news.

Give Live Search a try at live.com.

Microsoft’s digital identity metasystem

This content is 19 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

After months of hearing about Windows Vista eye candy (and hardly anything of real substance with regard to the operating system platform), there seems to be a lot of talk about digital identity at Microsoft right now. A couple of weeks back I was at the Microsoft UK Security Summit, where I saw Kim Cameron (Microsoft's Chief Architect for identity and access) give a presentation on CardSpace (formerly codenamed "InfoCard") – a new identity metasystem contained within the Microsoft .NET Framework v3.0 (expected to ship with Windows Vista but also available for XP). Then, a couple of days ago, my copy of the July 2006 TechNet magazine arrived, themed around managing identity.

This is not the first time Microsoft has attempted to produce a digital identity management system. A few years back, Microsoft Passport was launched as a web service for identity management, but Passport didn't work out (Kim Cameron refers to it as the world's largest identity failure). The system works – 300 million people use it for accessing Microsoft services such as Hotmail and MSN Messenger, generating a billion logons each day – but people don't want Microsoft controlling access to other Internet services (eBay used Passport for a while but dropped it in favour of its own access system).

Digital identity is, quite simply, a set of claims made about a subject (e.g. “My name is Mark Wilson”, “I work as a Senior Customer Solution Architect for Fujitsu Services”, “I live in the UK”, “my website is at http://www.markwilson.co.uk/”). Each of these claims may need to be verified before they are acted upon (e.g. a party to whom I am asserting my identity might like to check that I do indeed work where I say I do by contacting Fujitsu Services). We each have many identities for many uses that are required for transactions both in the real world and online. Indeed, all modern access technology is based on the concept of a digital identity (e.g. Kerberos and PKI both claim that the subject has a key showing their identity).

Microsoft’s latest identity metasystem learns from Passport – and interestingly, feedback gained via Kim Cameron’s identity weblog has been a major inspiration for CardSpace. Through the site, the identity community has established seven laws of identity:

  1. User control and consent.
  2. Minimal disclosure for a constrained use.
  3. Justifiable parties.
  4. Directed identity.
  5. Pluralism of operators and technologies.
  6. Human integration.
  7. Consistent experience across contexts.

Another area where CardSpace fundamentally differs from Passport is that Microsoft is not going it alone this time – CardSpace is based on WS-* web services and other operating system vendors (e.g. Apple and Red Hat) are also working on comparable (and compatible) solutions. Indeed, the open source identity selector (OSIS) consortium has been formed to address this technology and Microsoft provides technical assistance to OSIS.

The idea of an identity metasystem is to unify access and to shield applications from the complexities of managing identity, in a manner which is loosely coupled (i.e. allowing for multiple operators, technologies and implementations). Many have compared this to the way in which TCP/IP unified network access, paving the way for the connected systems that we have today.

The key players in an identity metasystem are:

  • Identity providers (who issue identities).
  • Subjects (individuals and entities about which claims are made).
  • Relying parties (which require identities).

Each relying party will decide whether or not to act upon a claim, depending on information from an identity provider. In a real-world scenario, that might be analogous to arriving at a client's office and saying "Hello, I'm Mark Wilson from Fujitsu Services. I'm here to visit your IT Manager". The security/reception staff may take my word for it (in which case this is a self-issued identity and I am both the subject and the provider) or they may ask for further confirmation, such as my driving licence, company identity card, or a letter/fax/e-mail inviting me to visit.

In a digital scenario the system works in a similar manner. When I log on to my PC, I enter my username to claim that I am Mark Wilson but the system will not allow access until I also supply a password that only Mark Wilson should know and my claims have been verified by a trusted identity provider (in this case the Active Directory domain controller, which confirms that the username and password combination matches the one it has stored for Mark Wilson). My workstation (the relying party) then allows me access to applications and data stored on the system.

In many ways a username and password combination is a bad identity mechanism – we have trained users to trust websites that ask them to enter a password. Imagine what would happen if I were to set up a phishing site that asks for a password: even if the correct password is entered, the site claims that it is incorrect. A typical user (and I am probably one of those) will then try other passwords – the phishing site now has an extensive list of passwords which can be used to access other systems whilst pretending to be the user whose identity has been stolen. A website may be protected by many thousands of miles of secure communications but, as Kim Cameron put it, the last metre of the connection is from the computer to the user's head (hence identity law number 6 – human integration) – identity systems need to be designed in a way that is easy for users to make sense of, whilst remaining secure.

CardSpace does this by presenting the user with a selection of digital identity cards (similar to the plastic cards in our wallets) and highlighting only those that are suitable for the site. Only publicly available information is stored with the card (so that should hold phishers at bay – the information to be gained is useless to them) and because each card is tagged with an image (and only appropriate cards are highlighted for use), I know that I have selected the correct identity (why would I send my Government Gateway identity to a site that claims to be my online bank?). Digital identities can also be combined with other access controls such as smartcards. The card itself is just a user-friendly selection mechanism – the actual data transmitted is XML-based.

CardSpace runs in a protected subsystem (similar to the Windows login screen) – so when active there is no possibility of another application (e.g. malware) gaining access to the system or of screenscraping taking place. In addition, user interaction is required before releasing the identity information.

Once an identity has been selected, services that require identities can convert the supplied token between formats using WS-Trust for encapsulating protocol and claims transformation, with WS-MetadataExchange and WS-SecurityPolicy used for negotiation. This makes the Microsoft implementation fully interoperable with other identity selector, relying party and identity provider implementations.

Microsoft is presently building a number of components to its identity metasystem:

  • CardSpace identity selector (usable by any application, included within .NET Framework v3.0 and hardened against tampering and spoofing).
  • CardSpace simple self-issued identity provider (makes use of strong PKI so that the user does not disclose passwords to relying parties).
  • Active Directory managed identity provider (to plug corporate users in to the metasystem via a full set of policy controls to manage the use of simple identities and Active Directory identities).
  • Windows Communication Foundation (for building distributed applications and implementing relying party services).

Post-Windows Vista, we can expect the Windows login to be replaced with a CardSpace-based system. In the meantime, to find out more about Microsoft's new identity metasystem, check out Kim Cameron's identity blog, the Windows CardSpace pages, David Chappell's Introducing InfoCard article on MSDN, and the July 2006 issue of TechNet magazine.