Microsoft infrastructure architecture considerations: part 2 (remote offices)

This content is 17 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

Continuing from my earlier post which sets the scene for a series of posts on the architectural considerations for designing a predominantly-Microsoft IT infrastructure, in this post, I’ll look at some of the considerations for remote offices.

Geographically dispersed organisations face a number of challenges in supporting remote offices, including: WAN performance and reliability; provisioning new services, applications and servers; management; remote user support; user experience; data security; space; and cost.

One approach that can help with some (not all) of these concerns is placing a domain controller (DC) in each remote location; but this has been problematic until recently because it increases the overall number of servers (it’s not advisable to co-locate other services on a domain controller because administration can’t be delegated to a local administrator on a domain controller and the number of Domain Admins should be kept to a minimum) and it’s a security risk (physical access to the domain controller computer makes a potential hacker’s job so much simpler). For that reason, Microsoft introduced read only domain controllers (RODCs) in Windows Server 2008.

There are still some considerations as to whether this is the appropriate solution though. Benefits include:

  • Administrative role separation.
  • Faster logon times (improved access to data).
  • Isolated corruption area.
  • Improved security.

whilst other considerations and potential impacts include:

  • The need for a schema update.
  • Careful RODC placement.
  • Impact on directory-enabled applications.
  • Possibility of site topology design changes.

Regardless of whether a remote office DC (either using the RODC capabilities or as a full DC) is deployed, server sprawl (through the introduction of branch office servers for a variety of purposes) can be combatted with the concept of a branch “appliance” – not an appliance in the true sense of dedicated hardware running an operating system and application heavily customised to meet the needs of a specific service, but a server built on appliance principles, running multiple workloads in a manner that allows for self-management and healing.

The first step is to virtualise the workloads. Hyper-V is built into Windows Server 2008 and the licensing model supports virtualisation at no additional cost. Using the server core installation option, the appliance (physical host) management burden is reduced with a smaller attack surface and reduced patching. Multiple workloads may be consolidated onto a single physical host (increasing utilisation and removing end-of-life hardware) but there are some downsides too:

  • There’s an additional server to manage (the parent/host partition) and child/guest partitions will still require management but tools like System Center Virtual Machine Manager (SCVMM) can assist (particularly when combined with other System Center products).
  • A good business continuity plan is required – the branch office “appliance” becomes a single point of failure and it’s important to minimise the impact of this.
  • IT staff skills need to be updated to manage server core and virtualisation technologies.

So, what about the workloads on the branch office “appliance”? First up is the domain controller role (RODC or full DC) and this can be run as a virtual machine or as an additional role on the host. Which is “best” is entirely down to preference – running the DC alongside Hyper-V on the physical hardware means there is one less virtual machine to manage and operate (multiplied by the number of remote sites) but running it in a VM allows the DC to be “sandboxed”. One important consideration is licensing – if Windows Server 2008 standard edition is in use (which includes one virtual operating system environment, rather than enterprise edition’s four, or datacenter edition’s unlimited virtualisation rights) then running the DC on the host saves a license – and there is still some administrative role separation as the DC and virtualisation host will probably be managed centrally, with a local administrator taking some responsibility for the other workloads (such as file services).
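The licensing arithmetic behind that last point can be sketched out. This is a toy illustration only, assuming (as the standard edition terms described above suggest) that each Windows Server 2008 Standard license covers the physical installation plus one virtual operating system environment (VOSE); the function name and figures are mine, not Microsoft’s:

```python
def standard_licences_needed(workloads_per_site: int, dc_on_host: bool) -> int:
    """Windows Server 2008 Standard licences per branch site.

    Assumes each Standard licence covers the physical installation plus one
    virtual operating system environment (VOSE). workloads_per_site counts
    every workload, including the domain controller.
    """
    # If the DC runs in the parent partition alongside Hyper-V, it consumes
    # no VOSE; otherwise it is one more virtual machine to license.
    vms = workloads_per_site - 1 if dc_on_host else workloads_per_site
    # The first licence covers the host and one VM; each further VM needs
    # another licence.
    return max(1, vms)

# Two workloads per branch (DC + file services), across 20 remote sites:
print(standard_licences_needed(2, dc_on_host=True) * 20)   # 20 licences
print(standard_licences_needed(2, dc_on_host=False) * 20)  # 40 licences
```

The saving multiplies by the number of remote sites, which is exactly the trade-off to weigh against the “sandboxing” benefit of running the DC in a VM.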

That leads on to a common workload – file services. A local file server offers a good user experience but is often difficult to back up and manage. One solution is to implement DFS-R in a hub and spoke arrangement and to keep the backup responsibility in the data centre. If the remote file server fails, then replication can be used to restore from a central server. Of course, DFS-R is not always ideal for replicating large volumes of data; however, the DFS arrangement allows users to view local and remote data as though it were physically stored in a single location, and there have been a number of improvements in Windows Server 2008 DFS-R (cf. Windows Server 2003 R2). In addition, SMB 2.0 is less “chatty” than previous implementations, allowing for performance benefits when using a Windows Vista client with a Windows Server 2008 server.

Using these methods, it should be possible to avoid remote file server backups and remote DCs should not need to be backed up either (Active Directory is a multi-master replicated database so it has an inherent disaster recovery capability). All that’s required is some method of rebuilding a failed physical server – and the options there will depend on the available bandwidth. My personal preference is to use BITS to ensure that the remote server always holds a copy of the latest build image on a separate disk drive and then to use this to rebuild a failed server with the minimum of administrator intervention or WAN traffic.

In the next post in this series, I’ll take a look at some of the considerations for using network access protection to manage devices that are not compliant with the organisation’s security policies.

Microsoft infrastructure architecture considerations: part 1 (introduction)


Last week, I highlighted the MCS Talks: Enterprise Architecture series of webcasts that Microsoft is running to share the field experience of Microsoft Consulting Services (MCS) in designing and architecting Microsoft-based infrastructure solutions – and yesterday’s post picked up on a key message about software as a service/software plus services from the infrastructure futures section of session 1: infrastructure architecture.

Over the coming days and weeks, I’ll highlight some of the key messages from the rest of the first session, looking at some of the architectural considerations around:

  • Remote offices.
  • Controlling network access.
  • Virtualisation.
  • Security.
  • High availability.
  • Data centre consolidation.

Whilst much of the information will be from the MCS Talks, I’ll also include some additional information where relevant but, before diving into the details, it’s worth noting that products rarely solve problems. Sure enough, buying a software tool may fix one problem, but it generally adds to the complexity of the infrastructure and so does not get to the root issue. Infrastructure optimisation (even a self-assessment) can help to move IT conversations to a business level, as well as allowing the individual tasks that are required to meet the overall objectives to be prioritised.

Even though the overall strategy needs to be based on business considerations, there are still architectural considerations to take into account when designing the technical solution and, even though this series of blog posts refers to Microsoft products, there is no reason (architecturally) why alternatives should not be considered.

Software as a Service – or Software plus Services?


There’s a lot of media buzz right now about cloud computing – which encompasses both “web 2.0” and Software as a Service (SaaS). Whilst it’s undeniable that web services are becoming increasingly important, I’ll stand by my comments from a couple of years ago that the “webtop” will not be in mainstream use any time soon and that those who are writing about the death of Microsoft Windows and Office are more than a little premature.

Even so, I was interested to hear Microsoft’s Kevin Sangwell explain the differences between SaaS and the Microsoft idea of software plus services (S+S) during the recent MCS Talks session on infrastructure architecture.

I’ve heard Microsoft executives talk about software plus services but Kevin’s explanation cuts through the marketing to look at what S+S really means in the context of traditional (on-premise) computing and SaaS:

Kevin made the point that there is actually a continuum between on premise and SaaS solutions:

Software delivery continuum and software services taxonomy

  • We all understand the traditional software element – where software is installed and operated in-house (or possibly using a managed service provider).
  • Building block services are about using web services to provide an API to build applications “in the cloud” – so Amazon’s simple storage service (S3) is an example. This gives developers something to hook into and onto which to deliver a solution – for example, Jungle Disk uses the Amazon S3 platform to provide online storage and backup services.
  • Attached services provide self-contained functionality – for example anti-spam filtering of e-mail as it enters (or exits) an organisation.
  • Finished services are those that operate entirely as a web service – with salesforce.com being one, often quoted, example – Google Apps would be another (not that Microsoft are ever likely to promote that one…).

S+S is about creating a real-world hybrid – not just traditional or cloud computing but a combination of software and services – for example an organisation may use a hosted Exchange Server service but they probably still use Microsoft Outlook (or equivalent software) on a PC.

So, would moving IT services off to the cloud make all the associated IT challenges disappear? Almost certainly not! All this would lead to is a disjointed service and lots of unhappy business users. SaaS and S+S do not usually remove IT challenges altogether but they replace them with new ones – typically around service delivery (e.g. managing service level agreements, integrating various operational teams, etc.) and service support (e.g. presenting a coherent service desk with appropriate escalation between multiple service providers and the ability to assess whether a problem relates to internal IT or the hosted service) but also in relation to security (e.g. identity lifecycle management and information rights management).

Kevin has written an article for The [MSDN] Architecture Journal on the implications of software plus services consumption for enterprise IT and, for those who are interested in learning more about S+S, it’s worth a read.

So, you want to be an infrastructure architect?


Over the years I’ve had various jobs which have been basically the same role but with different job titles. Officially, I’ve been a Consultant, Senior Consultant, Project Manager, Senior Technical Consultant and Senior Customer Solution Architect (which would have been a Principal Consultant in the same organisation a few years earlier, but management swapped in the “architect” word for a drop in implied seniority) but, if you ask me what I am, I tend to say I’m an infrastructure architect.

Issue 15 of The [MSDN] Architecture Journal included an article about becoming an architect in a systems integrator. I read this with interest, as that’s basically what I do for a living (believe me, I enjoy writing about technology but it will be a long while before I can give up my day job)!

The Architecture Journal tends to have an application focus (which is only natural – after all, it is produced by a developer-focused group in a software company) and I don’t know much about application development but I do know how to put together IT solutions using common off the shelf (COTS) applications. I tend to work mostly with Microsoft products but I’ve made it my business to learn about the alternatives (which is why I’m a VMware Certified Professional and a Red Hat Certified Technician). Even so, I’m stuck at a crossroads. I’m passionate about technology – I really like to use it to solve problems – but I work for a managed services company (an outsourcer in common parlance) where we deliver solutions in the form of services and bespoke technology solutions are not encouraged. It seems that, if I want to progress in my current organisation, I’m under more and more pressure to leave my technical acumen behind and concentrate on some of the other architect competencies.

Architect competencies

I’m passionate about technology – I really like to use it to solve problems

I understand that IT architecture is about far more than just technology. That’s why I gained a project management qualification (since lapsed, but the skills are still there) and, over the years, I’ve developed some of the softer skills too – some of which can be learnt (like listening and communication skills), others of which only come with experience. I think it’s important to be able to dive into the technology when required (which, incidentally, I find helps to earn the respect of your team and then assists with the leadership part of the architect’s role) but just as important to be able to rise up and take a holistic view of the overall solution. I know that I’m not alone in my belief that many of the architects joining our company are too detached from technology to truly understand what it can do to address customers’ business problems.

Architect roles
OK, so I’m a solutions architect who can still geek out when the need arises. I’m still a way off becoming an enterprise architect – but do I really need to leave behind my technical skills (after having already dumped specialist knowledge in favour of breadth)? Surely there is a role for senior technologists? Or have I hit a glass ceiling, at just 36 years of age?

I’m hoping not – and that’s why I’m interested in the series of webcasts that Microsoft Consulting Services are running over the next few months – MCS Talks: Enterprise Architecture. Session 1 looked at infrastructure architecture (a recorded version of the first session is available) and future sessions will examine:

  • Core infrastructure.
  • Messaging.
  • Security and PKI.
  • Identity and access management.
  • Desktop deployment.
  • Configuration management.
  • Operations management.
  • SharePoint.
  • Application virtualisation.

As should be expected, being delivered by Microsoft consultants, the sessions are Microsoft product-heavy (even the session titles give that much away); however, the intention of the series is to connect business challenges with technology solutions and the Microsoft products mentioned could be replaced with alternatives from other vendors. More details on the series can be found on the MCS Talks blog.

This might not appeal to true enterprise architects but for those of us who work in the solution or technical architecture space, this looks like it may well be worth an hour or so of our time each fortnight for the rest of the year. At the very least it should help to increase breadth of knowledge around Microsoft infrastructure products.

And, of course, I’ll be spouting forth with my own edited highlights on this blog.

Bill Gates’ last day at Microsoft


So, after a 2 year transition, today is the day that Bill Gates steps down from his full-time job at Microsoft (although he will remain Microsoft’s chairman and will be involved in select projects based on direction from CEO Steve Ballmer and the rest of Microsoft’s leadership team).

I commented on Gates’ departure a couple of years back and more recently wrote about Mary-Jo Foley’s concept of Microsoft 2.0.

It’s 33 years since Microsoft was formed and 30 years since the famous photo of most of the founding employees was taken in Albuquerque. 30 years is a long time in IT. Come to think of it, 30 years is most of my life (I’m 36) and I was interested to read about how the famous photo had been recreated for 2008.

Meanwhile, Stephen Levy has written an article for Newsweek entitled “Microsoft After Gates. (And Bill After Microsoft.)”.

There’s a Microsoft video looking back at Gates’ life – and forward to the future but I prefer the version from the 2008 CES keynote:

Some people love to hate Microsoft. Some people can’t stand other people being successful – and it’s difficult to deny that Gates has been successful. For 14 years now, I’ve followed a career in IT, during which I’ve worked largely with Microsoft products, so I’d like to say “thank you and good luck” to the world’s most famous geek as he does what all of the world’s richest people should do at some stage in their lives and changes his focus to helping those who are less fortunate.

It’s OK to use a Mac at Microsoft (really)


Just before I went on my holidays, I changed my password for the Active Directory domain that I log on to at work. I wrote it down (bad practice, but I have a lot of passwords to remember…) but when I returned and tried to log on it didn’t work. I tried the old one too, but after 2 weeks of going cold turkey with no Internet access, my fingers had forgotten the old password (as had my brain) so I kept trying various options. No good. Locked out of my work PC – I thought I’d have to log a support call (oh joy!) and visit the office to access the network…

The next day I was due to be attending an all-day event at Microsoft, so I got my personal notebook PC (an Apple MacBook) ready with all the things I would need, and went to sleep for the night.

As is the way with these things, my slumber was disturbed by the sudden realisation that, because I had been disconnected from the network when I changed my password, my cached credentials on the computer used the old password, but my VPN connection would use the new one. And I could remember both the passwords. But, needing as much beauty sleep as I can cram in (believe me, I need it), I settled back down and took the MacBook with me to Microsoft anyway.

So, there I was, at a Microsoft event with 250 other people, in the middle of a dark room using a white notebook PC with a big Apple logo lit up for all to see. I wasn’t making a statement. It was running Vista (at least in a virtual machine!). And I was using OneNote to take meeting notes – honestly!

Thanks to Steve Lamb – who was presenting that morning – for somehow weaving his MacBook Air into the content of his presentation and making me feel better about using a Mac on the UK Microsoft Campus. Now I have some idea how Paul Thurrott felt at last year’s Windows Server 2008 Worldwide Technical Workshop, when he even put a note over the Apple logo to put the presenters’ minds at rest and explain that he was running Vista!

Microsoft Licensing: Part 9 (useful links)


When I set out to write a series on Microsoft software licensing, I never expected there to be a total of nine posts.

For those who missed the others, they were:

  1. Client and server.
  2. Licensing without CALs.
  3. Server products.
  4. System Center products.
  5. Virtualisation.
  6. Forefront security products.
  7. How to buy Microsoft software.
  8. Software Assurance.

In this final post in the series, I’ll provide some useful links to Microsoft software licensing resources.

To start off with, there’s the Microsoft Licensing Reseller Handbook – intended for partners but publicly available and packed with links.

For those interested in volume licensing, the Microsoft Volume Licensing Reference Guide is intended for customers and explains the various options that are available.

Next up, there’s Emma (Lady Licensing)’s Licensing and Software Asset Management blog, which has a stack of information – although I do find it a little odd that almost the first thing you read on the site is a notice that shouts “DO NOT COPY CONTENT WITHOUT MY EXPRESS PERMISSION” when the whole point is about sharing information (try Creative Commons) and much of that information appears to be a direct copy and paste from information provided by her employer (Microsoft)!


If you have a licensing query and it’s not covered in any of the links here, call Microsoft – for UK customers the number is 0870 60 10 100.

Finally, I mentioned in the first post that this series has been based on information from a TechNet presentation at Microsoft UK – thanks to Jackie Elleker at Microsoft UK for presenting the information in a way that even I could understand – and for answering my many questions. Jackie has also produced a short video with Blue Solutions in which she explains many of the key points of Microsoft Licensing – including payment options.

Microsoft Licensing: Part 8 (software assurance)


In my earlier post on how to buy Microsoft software, I mentioned Software Assurance (SA).

SA includes upgrade rights for all software released during the period of the agreement along with a number of additional benefits. Purchased as part of a volume license agreement or on an individual product, SA is a controversial subject – Microsoft will highlight the many advantages that it offers to customers, whereas IT Managers will often question its value.

Unless included within the terms of an Open Value License or Enterprise Agreement, SA costs between 25 and 29% of the accompanying license price and, although it can be renewed, it ends when the accompanying agreement terminates. An ROI tool is available to help assess the likely financial benefits of SA but the trouble with software is that it’s a bit like the proverbial London Bus – you wait years for a new release and then they all come along at once…
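Those percentages make the break-even point easy to sketch. This is a back-of-envelope illustration only, assuming (as was typical) that the 25-29% is charged annually; the prices are placeholders, not real figures:

```python
def sa_cost(licence_price: float, years: int = 3, annual_rate: float = 0.25) -> float:
    """Total SA spend over an agreement, at an annual rate of the licence price."""
    return licence_price * annual_rate * years

# For a licence costing 100 (any currency), over a typical three-year term:
print(sa_cost(100.0))                              # 75.0
print(round(sa_cost(100.0, annual_rate=0.29), 2))  # 87.0
# So three years of SA costs 75-87% of simply buying the licence again:
# roughly break-even if exactly one new version ships in the term, a clear
# win when several releases land in one term, and poor value when none do.
```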

For an IT Manager, this may mean that they don’t perceive their SA as having provided much benefit (e.g. if they didn’t see many new releases during the period of their agreement) but it can also work the other way. For example, I know of at least one Microsoft customer that has not re-signed their EA because in the last few years they have gained the rights to upgrade their desktop from XP to Vista, their Office productivity suite from Office 2003 to 2007, their server infrastructure from Windows Server 2003 R2 to 2008 and to perform a number of server application upgrades (Exchange Server 2003 to 2007, Live Communications Server 2005 to Office Communications Server 2007, SharePoint Portal Server 2003 to Office SharePoint Server 2007, Systems Management Server 2003 to System Center Configuration Manager 2007, Operations Manager 2005 to System Center Operations Manager 2007, etc.). Now they have the right to use all of that software so they have their infrastructure upgrades for the next few years “in the bag” and see no reason to re-sign the EA. That’s not good for Microsoft, but very good for my anecdotal customer.

The full list of SA benefits, at each stage in the lifecycle, includes:

  • Plan: New Version Rights; Spread Payments.
  • Deploy: Desktop Deployment Planning Services; Information Work Solution Services; Training Vouchers.
  • Use: eLearning; Home Use Program; Employee Purchase Program; Windows Vista Enterprise Edition; Desktop Optimisation Pack; Enterprise Source License Program.
  • Maintain: 24×7 Problem Resolution Support; TechNet Plus subscription; Cold backups for disaster recovery.
  • Transition.

The exact benefits that are available depend on the volume licensing agreement in place and an SA benefits comparison chart is available for download.

One of the major benefits for corporate users with Select or Enterprise agreements is the Microsoft Desktop Optimization Pack (MDOP). This contains five additional technologies: Microsoft Application Virtualization (formerly Softricity SoftGrid); Microsoft System Center Desktop Error Monitoring; Microsoft Asset Inventory Service (formerly AssetMetrix); Microsoft Diagnostics and Recovery Toolset (formerly Winternals Administrator’s Pak); and Microsoft Advanced Group Policy Management (formerly DesktopStandard GPOVault).

MDOP is a big pull for many organisations – particularly the Application Virtualization element – but it is a subscription service which means that when the accompanying volume license agreement ends so does the right to use the MDOP tools.

For many, a crystal ball would be useful when deciding if SA is appropriate – it all depends on how an organisation’s roadmap is aligned with new product releases and, consequently, whether the benefits of SA will actually be of use. My view is that there are some substantial benefits available – and I’d suggest that the MDOP benefits might actually help to reduce operational costs and therefore finance the SA.

In the final part of this series on software licensing, I’ll summarise the eight posts so far and provide links to further information.

Microsoft Licensing: Part 7 (how to buy Microsoft software)


Continuing the series on Microsoft licensing, I’m going to look at how to buy Microsoft software. Basically, there are three ways to buy a license:

  • Full packaged product (FPP) – purchased from a retailer and typically a single box contains a single license.
  • Original equipment manufacturer (OEM) – software supplied with a computer and which “lives and dies” on that machine.
  • Volume licensing – purchased from resellers with a variety of programmes to suit different types of organisation.

Technically, there is a fourth method too – software may be made available for download free of charge from the web (although this is still subject to an end user license agreement).

There’s not much to say about buying FPP software – except that it’s the most expensive way to buy software and should be avoided where possible.

OEM software packs are intended for system builders only and are not for distribution to end users, unless those end users are acting as system builders by assembling their own PCs. It is often possible to purchase OEM software from distributors, but there are conditions attached: the system builder license agreement is effectively accepted when the shrink-wrap on the software is broken, and acceptance of those terms includes an obligation to offer support on the product.

Effectively, if I buy OEM software from a distributor and build a PC for someone (even family) with that software pre-applied, I need to offer end-user support.

OEM software requires product activation, is only available as a full product (no upgrades – although software assurance may be available if enrolled by the end user within 30 or 90 days, depending on the product) and must be pre-installed with the certificate of authenticity or proof of license label attached to the hardware. Once installed, the product is only available for use with that computer and cannot be transferred.

Certain OEM software may be legally downgraded – for example, Windows Vista Business and Ultimate Editions may be downgraded to Windows XP Professional, and Windows Server 2008 may be downgraded to Windows Server 2003 or Windows 2000 Server. One notable exception is Office 2007, which cannot be downgraded. Instead, Microsoft has the concept of an Office Ready PC – a 60-day trial version of Office 2007 for pre-installation by the OEM, sold with a medialess license kit (MLK). The end user can upgrade to a full version of Office when the trial ends and end-user technical support is offered by Microsoft, rather than by the OEM.

OEM copies of Windows also include the rights to produce images of the software for deployment.

Before moving on to look at Volume licensing, let’s examine licensing Windows desktop operating systems, where there is one point I need to make crystal clear – FPP and OEM are the only ways to purchase a full Windows desktop operating system license.

This means that if an organisation thinks it can save money by buying PCs without Windows (certain vendors will do this, e.g. a grey box PC with Linux pre-installed) and then applying copies of Windows obtained through a volume license programme, it is not licensed to use Windows. The only way to become legal from this situation is to buy an FPP copy of Windows. Windows Vista Business upgrade licenses sold through volume license agreements are upgrades only (for Windows XP Professional computers) and are not intended for installation on a “naked” PC.

For any organisation with more than 5 users, Volume licensing programmes are available. Volume licensing separates the license from the media, packaging and support as well as offering flexible rights such as:

  • Downgrade rights.
  • Transfer and reassignment rights (except Windows Vista upgrade – and FPP products have a one-time transfer right).
  • Imaging rights.
  • Flexible payment options.
  • Alternative language use rights.

The type of agreement will depend on the number of users, and whether or not the software is to be purchased (a perpetual license) or leased (non-perpetual):

  • Owned, 5-250 PCs: Open License; Open Value License (with SA); Open Value License Company Wide (with SA for all PCs).
  • Owned, >250 PCs: Select License; Enterprise Agreement (with SA for all PCs).
  • Leased, 5-250 PCs: Open Value License Subscription (with SA for all PCs).
  • Leased, >250 PCs: Enterprise Subscription Agreement (with SA for all PCs).

In the case of leased (non-perpetual) software, the agreement can be converted or re-purchased upon expiry but if these options are not exercised then the organisation is no longer licensed to use the software.

Open Licenses are sold by resellers via the distribution channel, whereas Select and Enterprise agreements are sold by specialist Large Account Resellers (LARs) via Microsoft.

For organisations looking to standardise their PCs (e.g. for support reasons), Open Value Company Wide or Enterprise Agreements (EAs) can be advantageous.

Software Assurance (SA) includes upgrade rights for all software released as long as the agreement is current. It also includes a number of other benefits (which is why I’ll save a full explanation for another post). SA can either be purchased as part of a volume license agreement or on an individual product.

There are also a number of special licensing arrangements for educational establishments – in addition to Open and Select licensing, Campus Agreement subscriptions and School Agreement subscriptions are available to schools, colleges and universities, as well as local education authorities, public libraries, public museums and some charitable organisations.

Further details of Microsoft Volume Licensing arrangements are available in the Microsoft Volume Licensing Reference Guide.

Microsoft also takes part in the Charity Technology eXchange (CTX) programme, donating software to eligible organisations at a very heavy discount (all that is charged is an administrative fee). Charities can request up to 50 licenses of each of 6 titles (selected from 13 available products) in a two-year period. Eligible charitable organisations are defined as non-profit or non-governmental organisations holding charitable status, with aims such as relief of poverty, advancement of education, social and community welfare, culture, the natural environment or other purposes that are beneficial to the community. An FAQ is available with further details on Microsoft’s involvement in CTX.

In the next post in this series, I’ll take a more detailed look at software assurance.

Microsoft Licensing: Part 6 (Forefront security products)


Continuing the series on licensing Microsoft software, in this post I look at the various security products that Microsoft offers. Many of these products are the result of acquisitions, so it may help to look at the old and new product names:

  • Sybari Antigen is now integrated into Forefront Server Security and Forefront Client Security.
  • FrontBridge services are now sold as Exchange Hosted Services (EHS).
  • The Whale Communications product is now offered as Internet Access Gateway (IAG).
  • Sybari Antigen Enterprise Manager has become the Forefront Server Security Management Console.

The Forefront security products make use of multiple anti-virus engines, with five engines included in the base cost (CA InoculateIT, CA VET, Microsoft Antivirus, Norman DataDefense and Sophos) and four more optional engines available (AhnLabs, Authentium, Kaspersky and Virus Busters). Included within the Forefront Security Suite are:

  • Forefront Client Security.
  • Forefront Client Security Management Console.
  • Forefront Security for Exchange Server.
  • Forefront Security for SharePoint.
  • Forefront Server Security Management Console.

All products are offered on a subscription basis although the Enterprise CAL (ECAL) suite includes the Forefront Security Suite with no extra licensing requirements.

The Exchange Enterprise CAL is also available to Select and Enterprise customers with services included, adding Forefront Security for Exchange Server and Exchange Hosted Filtering to the Exchange Enterprise CAL. This option is not available with retail licensing or to Open license customers, and must be taken up on a company-wide basis.

In the next part of this series, I’ll finally move on to take a look at the various methods that are available in order to buy Microsoft software.