How to stay current with Windows as a Service and Office 365 ProPlus

For many organisations, particularly those at “enterprise” scale, Windows and Office have tended to be updated infrequently, usually as major projects with associated capital expenditure. Meanwhile, operational IT functions that manage “business as usual” often avoid change, because new technology brings the risk of consequential effects. This approach is becoming increasingly untenable in a world of regular updates to software sold on a subscription basis.

This post looks at the impact of regularly updating Windows and Office in an organisation, and at how we need to modify our approach to reflect the world of Windows as a Service and “evergreen” Office 365.

Why do we need to stay current?

A good question. After all, if Windows and Office are working as required then surely there’s no need to change anything, is there? Unfortunately, things aren’t that simple and there are benefits for many business stakeholders in staying current:

  • For the CIO: improved management, performance, stability and support for the latest hardware.
  • For the CSO: enhanced security against modern threats and zero-day attacks.
  • For end users: access to the latest features and capabilities for better productivity and creativity.

Every Windows release evolves the operating system architecture to better defend against attacks – not just patching! And Windows and Office updates support new ways of working: inking, voice control, improved navigation, etc.

So, updates are good – right?

How often do I need to update?

We’re no longer in a world of 5+5 years (mainstream+extended) support. Microsoft has publicly stated its intention to ship two feature updates to Windows each year (in Spring and Autumn). The latest of these is Windows 10 1803 (also known as Redstone 4), which shipped in April 2018. Expect the next one in or around September 2018 (1809). Internally, Microsoft produces new builds daily; publicly, there are “Insider” Preview builds for evaluation.

That means that we need to stop thinking about Windows feature updates as projects and start thinking about them as a process – i.e. make updating Windows (and Office, and supporting infrastructure) part of the business-as-usual norm.

OK, but what if I don’t update?

Put simply, if you choose not to stay up to date, you’ll build up a problem for later. The point of having predictable releases is that it should help with planning.

But each release is only supported for 18 months. That means that you need to be thinking about getting users on n-2 releases updated before they get too close to the end of support. Today, that means (with a quick release check sketched after the list):

  • Running 1703, take action to update.
  • Running 1709, plan to update.
  • Running 1803, trailblazer!
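
Each device’s installed release can be read from the registry; a minimal PowerShell sketch (the ReleaseId value exists on Windows 10 version 1511 and later):

    # Read the installed Windows 10 release (e.g. 1709, 1803) from the registry
    $key = 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion'
    $release = (Get-ItemProperty -Path $key -Name ReleaseId).ReleaseId
    Write-Output "This device is running Windows 10 $release"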

We’re no longer looking at major updates every 3-5 years; instead an approach of continuous service improvement is required. This lessens the impact of each change.

So that’s Windows, what about Office?

For those using Office 365 ProPlus (i.e. licensing the latest versions of Office applications through an Office 365 subscription), Windows and Office updates are aligned – not to the day, but to the Spring and Autumn cadence.

So, keep Office updated in line with Windows and you should be in a good place. Build a process that gives the confidence and trust to move the two at the same time – the traditional approach of deploying Windows and Office separately often comes down to testing and deployment processes rather than any technical necessity.

What about my deployment tools? Will they support the latest updates?

According to Microsoft, more than 100 million devices are managed with System Center Configuration Manager (SCCM) – and SCCM also needs to be kept up to date to support upcoming releases.

SCCM releases don’t follow the six-monthly Windows cadence – they arrive roughly every four months – and the intention is for each SCCM release to support the next version of Windows/Office ahead of its availability.

Again, start to prepare as early as possible – and think of this as a process, not a project. Deploy first to a limited set of users, then push more broadly.

Why has Microsoft made us work this way?

The world has changed. With Office existing on multiple platforms, and systems under constant threat of attack from those who wish to steal our data (and money), it’s become necessary to move from a major update every 3-5 years to a continuous process that executes every few months – providing high levels of stability and access to the latest features and functionality.

Across Windows, Office, Azure and System Center, Microsoft is continually improving security, reliability and performance, whilst integrating cloud services to add functionality and to simplify the process of staying current.

How can I move from managing updates as a project to making it part of the process?

As mentioned previously, adopting Windows as a Service involves a cultural shift from periodic projects to a regular process.

Organisations need to be continually planning and preparing for the next update using Insider Preview to understand the impact of upcoming changes and the potential provided by new features, including any training needs.

Applications, devices and infrastructure can be tested using targeted pilot deployments and then, once the update is generally available and known to work in the environment, a broader deployment can be instigated.

Aim to deploy to users following the model below for each stage (a simple sketch of splitting a user list into rings follows the list):

  • Plan and prepare: 1%.
  • Targeted deployment: 9%.
  • Broad deployment: 90%.
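
The split itself is simple arithmetic; an illustrative PowerShell sketch (users.txt is a hypothetical export of user names, one per line – ring membership would then be mapped to whatever grouping your deployment tool uses):

    # Illustrative only: shuffle a user list and split it into 1%/9%/90% rings
    $users    = Get-Content .\users.txt | Sort-Object { Get-Random }
    $total    = $users.Count
    $nInsider = [int][math]::Ceiling($total * 0.01)
    $nPilot   = [int][math]::Ceiling($total * 0.09)
    $insider  = $users[0..($nInsider - 1)]
    $pilot    = $users[$nInsider..($nInsider + $nPilot - 1)]
    $broad    = $users[($nInsider + $nPilot)..($total - 1)]
    'Insider: {0}; Pilot: {1}; Broad: {2}' -f $insider.Count, $pilot.Count, $broad.Count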

Remember, this is about feature updates, not a new version of Windows. The underlying architecture will evolve over time but Windows as a Service is about smaller, incremental change rather than the big step changes we’ve seen in the past.

But what about testing applications with each new release of Windows?

Of course, applications need to be tested against new releases – and there will be dependencies on support from other vendors too – but it’s important that the flow of releases should not be held up by application testing. If you test every application before updating Windows, it will be difficult to hit the rollout cadence. Instead, proactively assess which applications are used by the majority of users and address these first. Aim to move 80-90% of users to the latest release(s) and reactively address issues with the remaining apps (maybe using a succession of mini-pilots) but don’t stop the process because there are still a few apps to get ready!

You can also use alternative deployment methods (such as virtualised applications or published applications) to work around compatibility issues.

It’s worth noting that most Windows 7-compatible apps will be compatible with Windows 10: the same app development platform, driver servicing model, etc. are used. Some device drivers may not exist for Windows 10 but most do, and the availability of drivers and firmware through Windows Update has improved. BIOS support is getting better too.

In addition, there are around a million applications registered in the Ready For Windows database, which can be used for spot-checking ISVs’ Windows 10 support for each application and its prevalence in the wild.

New cloud-enabled capabilities to guide your Windows 10 deployment

Windows Analytics is a cloud-based set of services that collects information from within Windows and provides actionable information to proactively improve your Windows (and Office) environment.

Using Azure Log Analytics, Windows Analytics can advise on the following (a device enrolment sketch follows the list):

  • Readiness (Windows 10 Professional): planning and addressing actions for upgrade from Windows 7 and 8.1 as well as Windows 10 feature updates.
  • Compliance (Windows 10 Professional): for regular (monthly) updates.
  • Device health (Windows 10 Professional and Enterprise): assessing issues across the estate (e.g. problematic device drivers).
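
Devices are enrolled by tagging them with the workspace’s Commercial ID. A hedged sketch, assuming the registry location and value names documented for Upgrade Readiness at the time (the GUID is a placeholder for your own workspace ID; Microsoft’s deployment script or Group Policy are the usual routes):

    # Sketch: associate a device with a Windows Analytics workspace
    $key = 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\DataCollection'
    if (-not (Test-Path $key)) { New-Item -Path $key -Force | Out-Null }
    Set-ItemProperty -Path $key -Name CommercialId -Value '00000000-0000-0000-0000-000000000000'
    Set-ItemProperty -Path $key -Name AllowTelemetry -Value 1 -Type DWord  # 1 = Basic; Device Health needs Enhanced (2)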

OK, so I understand why I need to continuously update Windows, but how do I do it?

Microsoft recommends using a system of deployment rings (which might be implemented as collections in SCCM) to roll out to users in the 1% (Insider), 9% (Pilot) and 90% (Broad) deployments mentioned above. This approach allows for a consistent but controllable rollout.

Peer-to-peer download technologies embedded in Windows will minimise network usage, and recent versions support express updates (downloading only the deltas), whilst the impact on users can be minimised through scheduling.
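
In recent Windows 10 releases the peer-to-peer component is Delivery Optimization; a minimal sketch of constraining it via the policy registry value (assuming the documented DODownloadMode settings – Group Policy or MDM is the usual delivery mechanism):

    # Sketch: limit Delivery Optimization peering to the local network
    # (DODownloadMode: 0 = HTTP only, 1 = LAN peers, 2 = private group, 3 = Internet peers)
    $key = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\DeliveryOptimization'
    if (-not (Test-Path $key)) { New-Item -Path $key -Force | Out-Null }
    Set-ItemProperty -Path $key -Name DODownloadMode -Value 1 -Type DWord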

When it comes to tools, there are a few options available:

  • Windows Update is the same service used by consumers to download updates at the rate governed by Microsoft.
  • Windows Update for Business is a version of Windows Update that allows an organisation to control its release schedule and set up deployment rings without any infrastructure (the policy settings behind it are sketched after this list).
  • Windows Server Update Services (WSUS) allows feature updates to be deployed when approved, and BranchCache can be used to minimise network impact.
  • Finally, SCCM can work with WSUS and offers Task Sequences, etc. to provide greater control over deployment.
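
For Windows Update for Business, deferral periods per ring drive the staggered rollout. A minimal sketch, assuming the registry values behind the Windows 10 1703+ Group Policy/MDM settings (in practice these would be set by policy, with different values per deployment ring):

    # Sketch: defer feature updates by 90 days and quality updates by 14 days
    $key = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate'
    if (-not (Test-Path $key)) { New-Item -Path $key -Force | Out-Null }
    Set-ItemProperty -Path $key -Name DeferFeatureUpdates -Value 1 -Type DWord
    Set-ItemProperty -Path $key -Name DeferFeatureUpdatesPeriodInDays -Value 90 -Type DWord
    Set-ItemProperty -Path $key -Name DeferQualityUpdates -Value 1 -Type DWord
    Set-ItemProperty -Path $key -Name DeferQualityUpdatesPeriodInDays -Value 14 -Type DWord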

What about the normal “Patch Tuesday” updates?

Twice-annual feature updates don’t replace the need to patch more regularly and Microsoft continues to release cumulative updates each month to resolve security and quality issues.

In effect, we should receive one feature update and then five monthly quality updates in each cycle.

The contents of this post are based on a webcast delivered by Bruno Nowak (@BrunoNowak), Director of Product Marketing (Microsoft 365) at Microsoft.

Weeknote 16: Anonymous? (Week 17, 2018)

This week has been another one split between two end-user computing projects – one at the strategy/business case stage and another that’s slowly rolling out and proving that the main constraint on any project is the business’s ability to cope with the change.

I can’t say it’s all enjoyable at the moment – indeed I had to apply a great deal of restraint not to respond to lengthy email threads that asked “why aren’t we doing it this way”… but the inefficiencies of email are another subject, for another day.

So, instead of a recap of the week’s activities, I’ll focus on some experiences I’ve had recently with “anonymous” surveys. I’m generally quite cynical about these, because if I have to log on to the platform to provide a response then it’s not truly anonymous – a point I highlighted to my colleagues in HR, who ask a weekly “pulse” question. “It’s not on your record”, I was told – yet progress is logged against me (tasks due, tasks completed, etc.) and only accessible when I’m logged in to the HR system. It’s the same for SharePoint surveys – if I need to use my Active Directory credentials, then it’s not anonymous.

I’m approaching my third anniversary at risual and I picked up an idea from my colleague James Connolly, who has been using a survey tool for a couple of years now, for soliciting feedback on my annual review from colleagues, partners and customers. Rather than use one of the tools on the wider Internet, like SurveyMonkey or Typeform, I decided to try Microsoft Forms – a newish Office 365 capability. It was really simple to create a form (and to make it anonymous, once I worked out how) but what I’ve been most impressed with is the reporting, with the ability to export all responses to Excel for analysis, or to view either an aggregated view of responses or the detail of each individual response within Microsoft Forms.

I went to great pains to make sure that the form is truly anonymous – not requiring logon, though I did invite people to leave their name if they were happy for me to contact them about their responses. Even so, with around 50 people invited to complete the form and a 50% response rate, I can take a guess at who some of the responses are from. By the same token, there are others where I wish I knew who wrote the feedback so I could ask them to elaborate some more!

I won’t be doing anything with the results, except saying “this is what my colleagues and customers think of me and this is where I need to improve”, but it does reinforce my thinking that very little in life is truly anonymous.

Next week includes a speaking gig at a Microsoft Modern Workplace popup event (though I’m not entirely comfortable with the demonstrations), more Windows 10 device rollouts and maybe, just maybe, some time to write some blog posts that aren’t just about my week…

UK Government Protective Marking and the Microsoft Cloud

I recently heard a consultant from another Microsoft partner talking about storing “IL3” information in Azure. That rang alarm bells with me, because Impact Levels (ILs) haven’t been a “thing” for UK Government data since April 2014. For the record, here’s the official guidance on the UK Government data security classifications, and this video explains why the system was changed.

Meanwhile, this one is a good example of what it means in practice.

So, what does that mean for storing data in Azure, Dynamics 365 and Office 365? Basically, information classified OFFICIAL can be stored in the Microsoft Cloud – for more information, refer to the Microsoft Trust Center. And, because OFFICIAL-SENSITIVE is not another classification (it merely highlights information where additional care may be needed), that’s fine too.

I’ve worked with many UK Government organisations (local/regional, and central) and most are looking to the cloud as a means to reduce costs and improve services. The fact that more than 90% of public data is classified OFFICIAL (indeed, that’s the default for anything in Government) is no reason to avoid using the cloud.

Adopting cloud services means being ready for constant change

There’s a news story today about how Microsoft may be repositioning some (or all) of Skype for Business as Microsoft Teams (the collaborative group-based chat service built on various Office 365 services, Skype for Business in particular).

The details of that story are kind of irrelevant to this post; it’s the reaction I got on Twitter that I felt the need to comment on (when I hit 5 tweeted replies I thought a blog post might be more appropriate).

Change is part of consuming cloud services. There’s a service agreement and a subscription/licensing agreement – customers consume the service as the provider defines it. The service provider will generally give notice of change but you normally have to accept it (or leave). There is no option to stay on legacy versions of software for months or years at a time because you’re not ready to update your ways of working or other connected systems.

That is a big shift and many IT departments have not adjusted their thinking to adopt this new way of working.

I’ve seen many organisations move to cloud services (mostly Office 365 and Azure) whilst sticking with their current approach. They do things like trying to map drive letters to OneDrive because that’s what users are used to, instead of showing them new (and often better) ways of working. They try to use old versions of Office with the latest services and wonder why the user experience is degraded. They think about the on-premises workloads (Exchange, Lync/Skype for Business, SharePoint) instead of the potential provided by the whole productivity platform that they have bought licences to use. They try to turn parts of the service off or hide them from users.

My former colleague Steve Harwood (@SteeveeH) did some work with one of risual’s customers to define a governance structure for Office 365. It’s great work – and maybe I’ll blog about it separately – but the point is that organisations need to think differently for the cloud.

Buying services from Microsoft, Amazon, Google, Salesforce, et al is not like buying them from the managed services provider that does its best to maintain a steady state and avoid change at all costs (or often at great cost!). Moving to the cloud means constant change. You may not have servers to keep up to date once your apps are sold on an “evergreen” subscription basis but you will need to keep client software up to date – not just traditional installed apps but mobile apps and browsers too. And when the service gains a new feature, it’s there for adoption. You may have the ability to hide it but that’s just a sticking plaster solution.

Often the cry is “but we need to train the users”. Do you really? Many of today’s business end users have grown up with technology. They are familiar with using services at home far more advanced than those provided by many workplaces. Intuitive user interfaces can go a long way and there’s no need to provide formal training for many IT changes. Instead, keep abreast of the advertised changes from your service provider (for example the Message Center in Office 365) and decide what the impact is of each new feature. Very few will need a full training package! Some well-written communications, combined with self-help forums and updated FAQs at the Service Desk will often be enough but there’s also the opportunity to offer access to Massive Open Online Courses (MOOCs) where training needs are more extensive.

There are, of course, examples of where service providers have rolled out new features with inadequate testing, or with too little notice but these are edge cases and generally there’s time to react. The problem comes when organisations stick their proverbial heads in the sand and try to ignore the inevitable change.

Providing fast mailbox access to Exchange Online in virtualised desktop scenarios

In last week’s post that provided a logical view on end user computing (EUC) architecture, I mentioned two sets of challenges that I commonly see with customers:

  1. “We invested heavily in thin client technologies and now we’re finding them to be over-engineered and expensive with multiple layers of technology to manage and control.”
  2. “We have a managed Windows desktop running <insert legacy version of Windows and Office here> but the business wants more flexibility than we can provide.”

What I didn’t say is that I’m seeing a lot of Microsoft customers who have a combination of these and who are refreshing parts of their EUC provision without looking at the whole picture – for example, moving email from Exchange to Exchange Online but not adopting other Office 365 workloads and not updating their Office client applications (most notably Outlook).

In the last month, I’ve seen at least three organisations who have:

  • An investment in non-persistent virtualised desktops (using technology products from Citrix and others).
  • A stated objective to move email to Exchange Online.
  • Office 365 Enterprise E3 or higher subscriptions (i.e. the licences for Office 365 ProPlus – the subscription-based “evergreen” Office clients) but no immediate intention to update Office from current levels (typically Office 2010).

These organisations are, in my opinion, making life unnecessarily difficult for themselves.

The technical challenges with such a solution come down to some basic facts:

  • If you move your email to the cloud, it’s further away in network terms. You will introduce latency.
  • Microsoft and Citrix both recommend caching Exchange mailbox data in Outlook.
  • Office 365 is designed to work with recent (2013 and 2016) versions of Office products. Previous versions may work, but with reduced functionality. For example, Outlook 2013 and later have the ability to control the amount of data cached locally – Outlook 2010 does not.

Citrix’s advice (in the Citrix Deployment Guide for Microsoft Office 365 for Citrix XenApp and XenDesktop 7.x) is to use Outlook Cached Exchange Mode; however, the same guide states “For XenApp or non-persistent VDI models the Cached Exchange Mode .OST file is best located on an SMB file share within the XenApp local network”. My experience suggests that, where Citrix customers do not use Outlook Cached Exchange Mode, they will have a poor user experience connecting to mailboxes.

Often, a migration to Office 365 (e.g. to make use of cloud services for email, collaboration, etc.) is best combined with Office application updates. Whilst Outlook 2013 and later versions can control the amount of data that is cached, in a virtualised environment this represents a trade-off between reducing login times and reducing the impact of slow network access to the mailbox.

Put simply: you can’t have fast mailbox access to Exchange Online without caching on virtualised desktops, unless you want to add another layer of software complexity.
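
Where caching is used but .OST size and logon times are the concern, the sync window can be constrained centrally. A minimal sketch, assuming the SyncWindowSetting policy value used by Outlook 2016 (normally delivered via Group Policy; Outlook 2013 uses the 15.0 key):

    # Sketch: cache only the last 3 months of mail for the current user
    $key = 'HKCU:\Software\Policies\Microsoft\Office\16.0\Outlook\Cached Mode'
    if (-not (Test-Path $key)) { New-Item -Path $key -Force | Out-Null }
    Set-ItemProperty -Path $key -Name SyncWindowSetting -Value 3 -Type DWord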

So, where does that leave customers who are unable or unwilling to follow Microsoft’s and Citrix’s advice? Effectively, there are two alternative approaches that may be considered:

  • The use of Outlook on the Web to access mailboxes using a browser. The latest versions of Outlook on the Web (formerly known as Outlook Web Access) are extremely well-featured and many users find that they are able to use the browser client to meet their requirements.
  • Third-party solutions, such as those from FSLogix, can be used to create “profile containers” for user data, such as cached mailbox data.

Using faster (SSD) disks for XenApp servers and improving the speed of the network connection (including the Internet connection) may also help but these are likely to be expensive options.

Alternatively, take a look at the bigger picture – go back to basics and look at how best to provide business users with a more flexible approach to end user computing.

Finding the PlanId for a Microsoft Planner Plan

Yesterday, I wrote about creating Microsoft Planner tasks from email using Microsoft Flow. At the time, my flow wasn’t quite working because, for some reason, Flow wouldn’t pull through the details of all of my plans. I even deleted and recreated a plan but Flow would only show me one. And entering a custom value with the name of my plan in my flow resulted in a “Schema error for field PlanId in entity Task: Field failed schema validation” message.

That was, until I found a very useful nugget of information in the PowerApps Community forums. To find the PlanId, open the corresponding Plan in a browser and the last part of the URL contains the PlanId:

Finding the PlanID for a Microsoft Planner Plan
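
In scripting terms, it’s just string handling; a minimal PowerShell sketch (the URL here is a made-up example, not a real plan):

    # Sketch: extract the PlanId from the last segment of a Planner URL
    $url = 'https://tasks.office.com/contoso.com/Home/PlanViews/AAkwdqlPPUSDKrzDEDk5VGQACrxj'
    $planId = ($url.TrimEnd('/') -split '/')[-1]
    Write-Output $planId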

Put that into your flow and the corresponding list of BucketIds should then be visible:

Bucket Id located based on the Plan Id

Now my flow runs and puts the plain-text contents of an email into the subject of a new task. Unfortunately, I’m still working on how to populate other fields in the task and I think I may have hit the current limits of the Microsoft Flow-Planner integration.

Creating Microsoft Planner tasks from email using Microsoft Flow

Work is pretty hectic at the moment. To be honest, that’s not unusual but scanning through tweets at lunchtime or at the start/end of the day is not really happening. I tend to take a look in bed (a bad habit, I know) and often think “that looks interesting, I’ll read it tomorrow” or “I’ll retweet that, but in the daytime when my followers will see it”.  At the moment, my standard approach is to email the tweets to myself at work but, 9 times out of 10, they just sit in my Inbox and go no further.

So, I thought I’d set up a Kanban board in Microsoft Planner for interesting tweets (I already have one for future blog posts). That’s pretty straightforward but one of the drawbacks with Planner is that you can’t email tasks to the plan. That’s a pretty big omission in my view (and it seems I’m not alone) as I believe it’s something that can be done in Trello (which is the service that Planner is trying to compete with).

I got thinking though, one of the other services that might help is Microsoft Flow. What if I could create a flow to receive an email (in my own mailbox) and then create an item in a plan, then delete the email?

The first challenge was receiving the email. I set up a new email alias for my account, but mail sent to interestingtweets@markwilson.it wouldn’t trigger the flow, because it’s a secondary address.

So, I switched to looking for a particular string in the subject of the email. That worked. But creating an item in the plan was failing with a “Bad Request” error. I took a look at the advice for troubleshooting a flow and, digging a little deeper, found the failure message Schema error for field Assignments in entity Task: Field failed schema validation. That was because I was using dynamic content to assign the task to myself (so I removed that setting).

This left me with a different message: Schema error for field Title in entity Task: Field failed schema validation. That turned out to be because I was using the message body as the title of the task and Planner was only happy if I sent it as plain text (not as HTML). I can convert the HTML to plain text in Flow, but the multi-line content still fails validation…

So far, I’ve been able to successfully create tasks from single-line emails in one of my plans but not in the one I created for this purpose (it’s not appearing as a target and, if I enter the name manually, the flow fails with a “Schema error for field PlanId in entity Task: Field failed schema validation” message)… I’ve made the plan publicly visible, so I’ll wait and see if that makes a difference (it hasn’t so far). If not, I may need to remove and recreate the plan.

So near, yet so far. And ideally, I’d be able to do something more intelligent with the task items (like to read links from the email and add them as links to the task in Planner) – maybe what I want is too much for Flow and I need to use a Logic App instead.

At the moment, this is what my Flow looks like:

Microsoft Flow to create a task in Microsoft Planner from an email

When I have it working with marking the email as read, I’ll change it over to deleting the email instead – after all, I don’t need an email and a task in Planner!

Outlook gotcha: only cached data is exported to data file (.PST)

This weekend, a family project that required its own mailbox ended, meaning I could reduce the number of licences in my Exchange Online subscription. That’s straightforward enough but I wanted to take a backup copy of the email before cutting the mailbox loose.

From the last time I did any Exchange Online administration, I recalled that one of the limitations was that you can’t back up a mailbox to a PST from PowerShell. That may have changed but the advice at the time was to back up to an Outlook data file (also known as a Personal Folder) in Outlook. It’s clunky but at least it’s functional.

I couldn’t work out why not all of the data was being exported: only the items that were cached were included, not the ones that appeared if I clicked on “There are more items in this folder on the server/click here to view more on Microsoft Exchange”. Then I found a clue in a Spiceworks post from Joe Fenninger, where Joe says “Dont [sic] forget to download all [Office 365] content prior to export”.

I needed to adjust the cached mode settings for the mailbox to change how much email is kept offline, after which Outlook could export all items to the Outlook Data File, rather than just the ones that were cached locally.
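
One way to sanity-check an export is to compare the item count in the .PST against the server-side figures. A minimal sketch using Exchange Online PowerShell (the mailbox name is a placeholder):

    # Sketch: check the server-side item count/size before trusting the .PST export
    Get-MailboxStatistics -Identity 'family-project@example.com' |
        Select-Object DisplayName, ItemCount, TotalItemSize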

Securing the modern productive enterprise with Microsoft technology

“Cybercrime costs projected to reach $2 trillion by 2019” [Forbes, 2016]

99: The median number of days that attackers reside within a victim’s network before detection [Mandiant/FireEye M-Trends Report, 2017]

“More than 63% of all network intrusions are due to compromised user credentials” [Microsoft]

The effects of cybercrime are tremendous, impacting a company’s financial standing, its reputation and, ultimately, its ability to provide security of employment to its staff. Nevertheless, organisations can protect themselves: the risks of cyber-attack can be mitigated by applying people, process and technology.

Fellow risual architect Tim Siddle (@tim_siddle) and I have published a white paper that looks at how Microsoft technology can be used to secure the modern productive enterprise. The tools we describe are part of Office 365, Enterprise Mobility + Security, or enterprise editions of Windows 10. Together they can replace many point solutions and provide a holistic view, drawing on Microsoft’s massive intelligent security graph.

Read more in the white paper:

Securing the modern productive enterprise with Microsoft technology

My first PowerApps app – a business mileage recorder

In common with many people who travel for work, I keep a record of my journeys so that I can claim mileage expenses. For the last couple of years, that record has been a spiral-bound notebook (for driving) and Strava (for cycling) – though I haven’t actually claimed any mileage for cycling yet! I wanted to replace my analogue system with a smartphone app and, following a conversation a few weeks ago with my colleague Brian Cain (@BrianCainUC), I decided to create something using Microsoft PowerApps.

For those who are unfamiliar with PowerApps, it’s a technology solution provided by Microsoft to help normal business users – people who are not developers – to create simple applications to connect systems and data. The resulting apps can run on mobile devices, as well as on Windows 10.

PowerApps is available in my Office 365 subscription (though I think there are other ways to sign up too) and I set to work creating my Mileage Recorder. A few minutes later I had something functional. Not long after that, I had tweaked it to be pretty much what I needed. So I created an app in less than 30 minutes and it’s taken me three weeks to write this blog post! Hmm…

Creating my first PowerApps app

My app is a simple three-screen app – taking a table in an Excel Workbook from OneDrive for Business as a data source. PowerApps recognised the data types in the columns of the table and formatted accordingly, then I tweaked things a little in PowerApps Studio.

PowerApps Mileage Recorder: Home, View and Edit screens

I haven’t looked in detail at the architecture used by PowerApps but essentially the PowerApps app provides a native OS wrapper for any apps that I create. This means my app will work on any platform where PowerApps is supported.

PowerApps Mileage Recorder

I can also create a direct link to the app on my phone’s home screen but the look and feel is one of a PowerApps app – not a native application. None of that is an issue – if I want more complex cross-platform apps then someone who can cut code (not me!) can use Xamarin – but for a simple app, PowerApps seems to do the job.

PowerApps/Mileage Recorder on iOS Home Screen

The PowerApps documentation helped me out a lot – the tutorials in particular.

There’s also a useful Q&A on using PowerApps within an organisation.

I did have some challenges worth noting but none are show-stoppers:

  • The Windows 10 smartphone that I use for work doesn’t meet the PowerApps hardware requirements, which is a little bizarre. So, I needed to use the app on my personal iPhone. I had created my PowerApp using my employer’s Office 365 tenant and a data source in my work OneDrive but I also use the Outlook app on iOS to connect to my personal Office 365 tenant. This combination was causing challenges that required re-authentication. I couldn’t find an easy way to move the app between tenants (though I’m sure there is one) so I moved the data source to my own tenant and recreated the PowerApp. I’m pretty sure that there must be a proper way to import and export apps, I just haven’t found it yet!
  • The web version of the PowerApps Studio seems a bit flaky at times but it is still a preview. Installing the Universal Windows Platform (UWP) version on a Windows 10 PC worked flawlessly though, even without any admin rights on my company Surface.
  • I couldn’t work out how to make a date and time field work as a simple date field. I really don’t need to record the time of my journeys – just the date!
  • PowerApps doesn’t support formulae in Excel workbooks. Instead, I had to apply some logic in the app to calculate the miles travelled, which displays in my app but doesn’t get written back to the data source. I’m pretty sure that’s fixable – I just haven’t worked out how, yet…

Is it really a good idea to let users create their own apps?

In my customer conversations, it’s quite common to hear IT people saying they don’t want their users creating PowerApps. I can see why – after all, we’ve all seen Access databases and Excel spreadsheets become “business-critical applications” that then create issues for the IT department. For what it’s worth, my view is that if something is really business critical, the business will invest resources into developing something that’s properly supportable. If it doesn’t reach that bar, then it’s not a business-critical app – and why would you prevent users from generating their own tools that help them to work more effectively, albeit unsupported by corporate IT?

To put it another way, people will do what they need to do to get things done, with or without IT’s blessing – so why not give them the tools to do things in a manner that integrates well with existing (supported) applications and services?

I’ll be at Microsoft tomorrow, attending a training event around PowerApps and Flow. That should give me a good opportunity to build on the experience from creating my Mileage Recorder. Together with Power BI (something else I really need to learn more about), these technologies provide a trilogy of tools to empower users to do more with data. And on that note, I should probably end this blog post, as I’m starting to sound like a Microsoft marketing representative…