Monthly retrospective: January 2025

Last year I tried a thing – another attempt at weeknotes. Weeknotes became monthly retrospectives. Monthly retrospectives sometimes became every two months… and then they dried up completely last summer. I’m sorry. I was busy and, to be honest, this blog is not as important to me as it once was.

But then, an anonymous commenter said that they miss them and asked me to fill the gap to the end of 2024. That might happen (or it might join the great list of unwritten blog posts in the sky), but let’s have another go at the present. So, 31 January, 2025. Monthly retrospective…

At work

Things have really stepped up a gear at work. Last year I started to work on a future vision around which the Office of the CTO could structure its “thought leadership” content. Some important people supported it and I found myself co-presenting to our executive board. The next steps will remain confidential, but it’s no bad thing for me. And, the follow-on work has given me a lot of exposure to some of the marketing activities – my last fortnight has been full of market analysis and ideal client profiles.

But the last fortnight was not just those things. I had the harebrained idea that, as productivity is one of the outcomes we seek for our clients, maybe we should “do something for National Productivity Week”. After writing a series of blog posts (see below), and a fun day recording video content with our brand team, it feels like a one-man social media takeover. In fact, we had so much content that some of it will now have to go out next week. But that’s OK – productivity is not just for one week of the year. These are the posts that are live on the Node4 website today:

And the last post, next week, will be about building sustainable productivity approaches.

There are also a couple of videos up on LinkedIn:

And, earlier in the month (actually, it sneaked out on YouTube before Christmas but I asked for it to be pulled for an edit), there was this one. Not my best work… but it did lead to the purchase of a teleprompter, which has made later videos so much easier!

Learning

Also on the work front, this month I completed my ILM Level 5 Award in Leadership and Management. Node4 runs this as part of a 7-month programme of workshops, with two coursework assignments that relate to four of the workshops. Over that time, I’ve covered:

  • Developing your personal leadership brand.
  • Inclusive leadership and motivation skills.
  • Managing and implementing strategic change.
  • Developing a high-performance team culture.
  • Manager as a coach.
  • Personal impact and emotional intelligence.
  • High impact presentations.

At home

Home Automation

I bought myself a Shelly temperature and humidity monitor for the Man Cave. It’s Home Assistant compatible, of course, so it lets me use cheap overnight energy to stop the cabin from getting too cold/damp.
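For anyone curious, the logic behind that automation is tiny. Here’s a sketch of the decision in Python (the real thing is a Home Assistant automation; the thresholds and tariff window below are illustrative, not my actual settings):

```python
from datetime import time

# Illustrative thresholds and off-peak window -- not the real settings.
OFFPEAK_START, OFFPEAK_END = time(0, 30), time(4, 30)
MIN_TEMP_C, MAX_HUMIDITY_PCT = 8.0, 70.0

def is_offpeak(now: time) -> bool:
    """True during the cheap overnight tariff window."""
    return OFFPEAK_START <= now < OFFPEAK_END

def heater_should_run(now: time, temp_c: float, humidity_pct: float) -> bool:
    """Run the heater on cheap electricity when the cabin is cold or damp."""
    return is_offpeak(now) and (temp_c < MIN_TEMP_C or humidity_pct > MAX_HUMIDITY_PCT)

print(heater_should_run(time(2, 0), 6.5, 55.0))   # cold at 2am -> True
print(heater_should_run(time(14, 0), 6.5, 55.0))  # cold, but peak-rate -> False
```

In Home Assistant itself, the same decision is just a time trigger plus numeric-state conditions on the Shelly’s sensor entities, switching a heater or dehumidifier plug.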

Also on the home automation front, I picked up some cheap Tapo P100 smart plugs. Like my no-name Chinese ESP32-based plugs, they are a better form factor than my older Kasa HS100/110 plugs, so they don’t take space from the adjacent socket. But they lack any kind of energy usage reporting, so I should have got a pack of the slightly more expensive P110 models instead. I also struggled to add them to Home Assistant. They were recognised but wouldn’t authenticate until I reset my TP-Link password (resetting seemed to be the workaround, even when the new password was the same as the old one)!

Getting away from it all

Aside from the tech, Mrs Wilson and I got away to London for a weekend, to celebrate a friend’s birthday. We were almost blown away by the tail of Storm Éowyn at Primrose Hill viewpoint but had fun (I’d never been before, but it’s in so many films!).

Tomorrow, I’m off to France for the UCI Cyclocross World Championships. Not competing of course (and disappointed that British Cycling is not sending a Women’s team or an U23 Men’s team). Just spectating. And probably consuming quite a lot of beer. And frites.

Writing

There have been some personal blog posts this month too:

In pictures

Some snaps from my Instagram:

Digital transformation is only as good as the supporting processes

Earlier today, I received a penalty charge notice. I’d dropped my son at the airport a couple of weeks ago – and thought my car was registered to auto-pay the Heathrow terminal drop-off charge. Apparently it’s not, because APCOA wrote to me demanding £80 for my mistake, reduced to £40 if I paid within 14 days (five of which had already passed because they used a slow postal service). Hey ho. It was a mistake. One which I’ll hopefully not make again. It’s annoying though.

It reminded me of another letter on my desk. You see, my confusion about autopayment came about because I do have AutoPay set up for the various London charges – Congestion Charge, ULEZ, Dartford Crossing. All the Transport for London (TfL) ones but not Heathrow Airport, it would seem…

A mystery charge based on flawed process

Last month, I was checking the transactions on my credit card and I spotted an extra charge to TfL. It seemed strange so I logged on to my account. My son had driven into the ULEZ one day (which I knew about and expected the charge for), but there was another charge he didn’t recognise.

Our car’s registration is KU07 ABC (it’s not really, but I’ve changed the details enough to tell the story, without publishing personal information). When I checked my online account, it showed a picture of KO07 ABC. But the ANPR had identified it as KD07 ABC. KD07 ABC is not a valid registration, so somewhere, either a human or an AI had decided that the charge should be allocated to our car. I suspect it was either based on the fact that our car had been driven in the ULEZ zone previously, or because someone has to check these things manually and they get very, very bored. Regardless, our Volkswagen Golf was not the Seat Ibiza in the photo.
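I have no idea how the allocation step actually works at TfL’s end, but a naive “nearest previously seen plate” match would produce exactly this kind of misallocation. A hypothetical sketch in Python (the plates and the matching rule are mine, not TfL’s):

```python
def hamming(a: str, b: str) -> int:
    """Count character positions where two registrations differ."""
    return sum(x != y for x, y in zip(a, b)) + abs(len(a) - len(b))

def closest_plate(anpr_read: str, known_plates: list[str]) -> str:
    """Allocate an invalid ANPR read to the closest previously seen plate."""
    return min(known_plates, key=lambda p: hamming(anpr_read, p))

# The misread "KD07 ABC" is only one character away from "KU07 ABC",
# so a naive nearest-match step pins the charge on the wrong car.
print(closest_plate("KD07 ABC", ["KU07 ABC", "AB12 XYZ"]))  # -> KU07 ABC
```

Whether it was a human or software doing the matching, one character of difference plus a prior ULEZ history is a weak basis for raising a charge.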

The cheque’s in the post

I contested the charge and was pleased to get an email a few days later that confirmed my complaint had been upheld, based on the evidence provided (TfL’s own photos from my account). But the part that amused me was this – the refund for this highly automated digital charging process was to be sent in the form of a cheque.

So, I have a very analogue cheque for £12.50, to pay into my account (as it’s 2025, I shall do this with a digital photo), but all as the result of a digital process that doesn’t quite work…

Postscript

A couple of days after I wrote this post, my Nectar card was used fraudulently. Someone managed to spend 1000 points (I think that’s £5) but it wasn’t me.

I contacted Nectar, who stopped the card and will issue another. But the process I had to go through was terrible. Before I could start an online chat session I needed to provide name, card number, and various other details. Then I reached an awful chat interface using Oracle software, which worked in a mobile browser but was in no way optimised for the screen I was using.

The agent then proceeded to ask me for the same details I had already provided. By this point I was very nervous about phishing attempts and reluctant to provide any information. It turned out to just be a shockingly bad process.

Featured image: Traffic sign image licensed under the Open Government Licence version 1.0 (OGL v1.0).

A few thoughts on the UK Government’s AI announcement

Most of the text in this post previously appeared on my LinkedIn feed. I thought it should have been here…

Sometimes, I read something on LinkedIn and repost or comment, before realising I’ve pretty much written an entire blog post. On my phone. Twice, because I navigated away and lost the first attempt. Maybe I should have put it here, but it probably gets seen by more people on LinkedIn. Still, I own this platform, so I’m putting it up for posterity.

The post in question was one from the BBC’s Technology Editor, Zoe Kleinman. Zoe had posted insights about the UK Prime Minister’s “bold and ambitious plans to support the UK’s AI sector”.

Zoe’s post and articles are well worth a read, but I wanted to add some more:

“[…] I can see why the UK wants to position itself as an innovative place for growth, without being (quite as) reliant on US tech behemoths, but most of us have yet to establish what we want to use AI for.

Sure, “AI” is the perceived answer to everything at the moment – and there are some very large companies with very deep pockets pouring billions into “AI” – but it’s an arms race. “Big tech” hasn’t worked out how to make money from its AI investments yet. The tech giants just want to make sure they have a big slice of that pie when we do finally get there.

Putting aside the significant environmental and social challenges presented by AI (as mentioned in Zoe’s post […]), “we” (our companies and our countries) haven’t got a solid business case. We just know we can’t afford to be left behind…

We’ve used some AI technologies in a variety of forms for years (for example Machine Learning) – and the recent advances in generative AI (genAI) have democratised access to AI assistants and opened a huge opportunity. But genAI is just one type of AI, and we don’t fully understand the large language models that underpin it.

One thing that sticks in my mind is something I heard on a recent podcast, when Johannes Kleske commented something along the lines of “when it’s in the future, it’s AI. Once we have worked out what to do with it, it’s just software.”

More on the UK Prime Minister’s AI announcement

Artificial Intelligence: Plan to ‘unleash AI’ across UK revealed [BBC News]

Are you in the UK and looking at using Apple AirPods Pro 2 as hearing aids? Read this first!

I’m sorry for the clickbait headline, but the urgency is real, because I’m seeing people making purchasing decisions based on a technical feature that’s not available in the UK yet.

If you’re a middle-aged man or woman, you may have noticed that it’s difficult to hear people in some social situations. I certainly have, and so have some of my friends. Generally in pubs and bars with hard surfaces and lots of background noise.

I tell myself that I need to get a professional hearing test. I keep trying at Specsavers when I have my eyes tested but have struggled with appointment availability. And anyway, it’s not that bad. Plus I don’t have a couple of thousand pounds ready for buying hearing aids.

Apple is bringing Hearing Health capabilities to the masses

When I heard that Apple AirPods Pro 2 have hearing aid capabilities, I was very interested. A consumer tech device that might help me in those limited circumstances when I need to wear a hearing aid, without the financial outlay.

It’s been possible to create an audiogram and use it with your AirPods (or other headphones) for a while, but there’s lots of excitement as Apple is bringing Hearing Health capabilities natively to the iPhone with iOS18 and with AirPods Pro 2. But, if you’re in the UK, you might want to hold off…

Here’s the problem: AirPods Pro 2 do not yet have regulatory approval as hearing aids in the UK.

They do in many other countries, but not here. Not at the time of researching this post in late-November 2024. But there is a global website, and a global ad campaign. Apple even says in the notes for this ad that:

“The Hearing Test and Hearing Aid features are regulated health features that require approval and will be offered after authorization is received. Feature availability varies by region”

Unfortunately, I’ve seen people (including those with profound hearing loss) saying they will ask Santa for some AirPods Pro for Christmas, based on advertising this feature.

So, what can you do?

  1. Firstly, and I rarely give this advice to anyone, turn off automatic updates. Do not let your iPhone update to iOS 18.x. Manually apply updates within iOS 17.x. Of course, that means you won’t get other iOS 18 goodness either, but Apple Intelligence isn’t available in the UK yet either (like the Hearing Aid feature, it’s “coming soon”).
  2. Then, download the Mimi app, find a quiet space and carry out a hearing test. Follow these instructions to save the audiogram to Apple Health and set up the Headphone Accommodations for your AirPods. Basically, you can get some of what Apple will bring to the UK, but only with older operating systems that don’t have the Apple capabilities built in (and turned on for other regions).
  3. Finally, keep an eye on the Apple website. This is the page that has the details on regional availability for the Apple Hearing Health features.

One more thing

The new Hearing Health features are for Apple AirPods Pro 2. I checked mine: they are listed on my receipt as “AirPods Pro (2nd generation)”. Is that the same thing? The short answer is “yes”, but it took me a while to get that information.

I had an infuriating online chat with Apple Support, who seemed incapable of understanding my question, despite me providing serial numbers and product codes. Thankfully, I also found an Apple support article, which gave me the answer (yes). Mine are model number A3048 which is now called “AirPods Pro 2 with MagSafe Charging Case (USB-C)”. Why can’t they just say “the marketing folks changed the name”?

Featured image by Miguel Angel Avila on Unsplash.

Microsoft Ignite 2024 on a page

You probably noticed, but Microsoft held its Ignite conference in Chicago last week. As is normal now, there’s a “Book of News” for all the major announcements and the keynotes are available for online review. But there’s an awful lot to sort through. Luckily, CNET created a 15-minute summary of Satya Nadella’s keynote:

Major announcements from Ignite 2024

Last year, I wrote about how it was clear that Microsoft is all about Artificial Intelligence (AI) and this year is no different. The rest of this post focuses on the main announcements with a little bit of analysis from yours truly on what the implications might be.

Investing in security, particularly around Purview
Data governance is of central importance in the age of AI. Microsoft has announced updates to prevent oversharing, risky use of AI, and misuse of protected materials. With one of the major concerns being accidental access to badly-secured information, this will be an important development for those that make use of it.
Find out more: https://aka.ms/Ignite2024Security/

Zero Day Quest
A new hacking event with $4m in rewards. Bound to grab headlines!
Find out more: https://aka.ms/ZeroDayQuest

Copilot as the UI for AI
If there’s one thing to take away from Ignite it’s that Microsoft sees Copilot as the UI for AI (it becomes the organising layer for work and how it gets done).

1. Every employee will have a Copilot that knows them and their work – enhancing productivity and saving time.
2. There will be agents to automate business processes.
3. And the IT department has a control system to manage, secure, and measure the impact of Copilot.

Copilot Actions
Copilot Actions are intended to reduce the time spent on repetitive everyday tasks – they were described as “Outlook Rules for the age of AI” (but for the entire Microsoft 365 ecosystem). I’m sceptical on these but willing to be convinced. Let’s see how well they work in practice.
Find out more: https://aka.ms/CopilotActions

Copilot Agents
If 2023-4 were about generative AI, “agentic” computing is the term for 2025.

There will be Agents within the context of a team – teammates scoped to specific roles – e.g. a facilitator to keep meeting focus in Teams and manage follow-up/action items; a Project Management Agent in Planner – to create a plan and oversee task assignments/content creation; self-service agents to provide information – augmenting HR and IT departments to answer questions and complete tasks; and a SharePoint Agent per site – providing instant access to real-time information.

Organisations can create their own agents using Copilot Studio – and the aim is that it should be as easy to create an Agent as it is to create a document.
Find out more: https://aka.ms/AgentsInM365

Copilot Analytics
Answering criticism about the cost of licensing Copilot, Microsoft is providing analytics to correlate usage to a business metric. Organisations will be able to tune their Copilot usage to business KPIs and show how Copilot usage is translating into business outcomes.
Find out more: https://aka.ms/CopilotAnalytics

Mobile Application Management on Windows 365
Microsoft is clearly keen to push its “cloud PC” concept – Windows 365 – with new applications so that users can access a secure computing environment from iOS and Android devices. Having spent years working to bring clients away from expensive thin client infrastructure and back to properly managed “thick clients”, I’m not convinced about the “Cloud PC”, but maybe I’m just an old man shouting at the clouds…
Find out more: https://aka.ms/WindowsAppAndroid

Windows 365 Link
Windows 365 Link is a simple, secure, purpose-built access device (aka a thin PC). It’s admin-less and password-less, with security configurations enabled by default that cannot be turned off. The aim is that users can connect directly to their cloud PC with no data left locally (available from April 2025). If you’re going to invest in this approach, then it could be a useful device – but it’s not a Microsoft version of a Mac Mini – it’s all about the cloud.
Find out more: https://aka.ms/Windows365Link

Windows Resiliency Initiative
Does anyone remember “Trustworthy Computing”? Well, the Windows Resiliency Initiative is the latest attempt to make Windows more secure and reliable. It includes new features like Windows Hotpatch to apply critical updates without a restart across an entire IT estate.
Find out more: https://aka.ms/WinWithSecurity

Azure Local
A rebranding and expansion of Azure Stack to bring Azure Arc to the edge. Organisations can run mission-critical workloads in distributed locations.
Find out more: https://aka.ms/AzureLocal

Azure Integrated HSM
Microsoft’s first in-house security chip hardens key management without impacting performance. This will be part of every new server deployed on Azure starting next year.
Find out more: https://aka.ms/AzureIntegratedHSM

Azure Boost
Microsoft’s first in-house data processing unit (DPU) is designed to accelerate data-centric workloads. It can run cloud storage workloads with 3x less power and 4x the performance.
Find out more: https://aka.ms/AzureBoostDPU

Preview NVIDIA Blackwell AI infrastructure on Azure
By this point, even I’m yawning, but this is a fantastically fast computing environment for optimised AI training workloads. It’s not really something that most of us will use.
Find out more: https://aka.ms/NDGB200v6

Azure HBv5
Co-engineered with AMD, this was described as a new standard for high performance computing and cited as being up to 8 times faster than any other cloud VM.
Find out more: https://aka.ms/AzureHBv5

Fabric
SQL Server is coming natively to Fabric in the form of Microsoft Fabric Databases. The aim here is to simplify operational databases as Fabric already did for analytical requirements. It provides an enterprise data platform that serves all use cases, making use of open source formats in the Fabric OneLake data lake. I have to admit, it does sound very interesting, but there will undoubtedly be some nuances that I’ll leave to my data-focused colleagues.
Find out more: https://aka.ms/Fabric

Azure AI Foundry
Described as a “first class application server for the AI age” – unifying all models, tooling, safety and monitoring into a single experience, integrated with development tools as a standalone SDK and a portal. 1,800 models in the catalogue for model customisation and experimentation.
Find out more: https://aka.ms/MaaSExperimentation and https://aka.ms/CustomizationCollaborations

Azure AI Agent Service
Build, deploy and scale AI apps to automate business processes. Compared with Copilot Studio’s graphical approach, this provides a code-first approach for developers to create agents, grounded in data, wherever it is.
Find out more: https://ai.azure.com/

Other AI announcements
There will be AI reports and other management capabilities in Foundry, including evaluation of models.

Safety is important – with tools to build secure AI, including PromptShield to detect/block manipulation of outputs and risk/safety evaluations for image content.

Quantum Computing
This will be the buzzword that replaces AI in the coming years. Quantum is undoubtedly significant but it’s still highly experimental. Nevertheless, Microsoft is making progress in the quantum arms race, with the “world’s most powerful quantum computer” with 24 logical qubits, double the previous record.
Find out more: https://aka.ms/AQIgniteBlog

Featured image: screenshots from the Microsoft Ignite keynote stream, under fair use for copyright purposes.

Putting AI to work: making content more accessible

I’m really struggling with AI right now. On the one hand, it’s infuriating when it doesn’t help me with the task or activity that I prompt it to, and hallucinates total garbage. I’m also concerned for humanity, as the blind faith that some have in AI seems to me to be pretty dangerous. But, on another level, if we strip away the overenthusiastic promises, and look at the little things that are now embedded in our productivity tools, it can be quite impressive.

Multiple takes

Today, I recorded a short video for an internal SharePoint site that supports some work I’m doing around Node4’s 2030 vision and strategy. It took about 8 takes and a lot of wrangling with technology for a simple talking head in front of a background. But I got there and, after an hour or so, I had the minute or so of content I wanted. I didn’t use professional video tools – just the things I have in Microsoft 365 and Windows 11 – I recorded myself in Teams against a custom background I’d created in PowerPoint, used Clipchamp to trim the video, and uploaded to Stream.

Copilot has entered the room

Then the cool stuff started. Microsoft Stream generated captions from the video. They weren’t quite right – but the edits were minimal and it let me make them. That means that I now have a working transcript and closed captions – assistive technologies to aid viewers.

But the impressive part was when I asked Copilot to summarise the content. This is what it came up with.

Key Topics:

  • Introduction to Vision2030: Mark Wilson, one of the Technology and Innovation Directors in Node4’s Office of the CTO, introduced Vision2030, a forward-looking strategy that outlines key trends and technologies expected to define the future of mid-market businesses. 0:00
  • Focus of Vision2030: Mark Wilson explained that Vision2030 focuses on integrating emerging innovations across all technology verticals in which Node4 operates, aiming to empower clients to anticipate and adapt to future technology trends, navigate complexity, and increase competitiveness and resilience. 0:21
  • Creating Lasting Value: Mark Wilson emphasized that Vision2030 is not just about adopting new technologies but also about helping clients create lasting value. 0:39
  • Future Content and Sales Plays: Mark Wilson mentioned that over the coming weeks and months, Node4 will create more content that builds on the core theme of enabling productivity and directly links to their sales plays, ensuring a clear connection from Vision through go-to-market to delivery. 0:45
  • Commitment to Mid-Market Businesses: Mark Wilson reiterated Node4’s commitment to guiding mid-market businesses through the transformation, ensuring they are well-positioned to succeed in the future digital economy. 0:57

Spot on. Sure, it has an edited transcription to work from, but now my colleagues don’t even need to watch the video. (Which raises the question of why I recorded it in the first place – to which the answer is choice.)

Changing the medium

So now, let’s take this a bit further… out of Copilot and Stream and into the real implications of this technology. Starting with a couple of observations:

  • When I’m driving, Apple CarPlay reads my messages to me. Or, I ask Siri to send a message, or to take a note.
  • When I’m in a group messaging situation, some people seem to have a propensity to create short form audio.

I used to think that WhatsApp voice messages were the spawn of the devil. Why should I have to listen to someone drone on for 30 seconds when I could read a text message much more quickly? Is it because they couldn’t be bothered to type? Then someone suggested it might be because they struggle with writing. That puts a different lens on things.

Create and consume according to our individual preferences

Now, with technologies like this we can create content in audio/video or written form – and that same content can be consumed in audio/video or written form. We can each use our preferred methods to create a message, and the recipient can use their preferred method to consume it.

This is the stuff that really makes a difference – the little things that make someone’s life easier – all adding up to a bigger boost in our individual productivity, or just getting things done.

Featured image – author’s own screenshot

Put the big rocks in first

This post previously appeared on my LinkedIn feed. I thought it should have been here…

A few weeks ago, I heard Michelle Minnikin refer to “big rocks first” on the WB-40 Podcast. It rang a bell with me – an approach to prioritising activities – first the big rocks, then the pebbles, then the sand. First attributed to Stephen Covey, it’s based on the story of a professor demonstrating to his class that they need to focus on the important things first, in order of priority, and then fit the minutiae of life around them. I’ve linked a 2 minute video at the end of this post that tells the story.

It seems I’ve used the analogy a lot recently – firstly helping someone manage the things that are making them anxious; now it seems that I’ll be doing the same with my son, in terms of prioritising activities to prepare for his A-Levels; and it works in a business context too – in terms of setting goals to achieve strategic aims.

So, whether it’s helping with mental health, learning about time management, or simply determining the priorities to achieve success, think about your rocks, pebbles, and sand.

And for a slightly longer (and older) video, here’s a practical demonstration featuring Stephen Covey himself:

Generative AI is just a small part of the picture

This post previously appeared on my LinkedIn feed. I thought it should have been here…

They say that, when all you have is a hammer, every problem that needs solving looks like a nail. Well, something like that anyway. Generative AI (GenAI) is getting a lot of airtime right now, but it’s not the answer to everything. Want a quick draft of some content? Sure, here it is – I’ve made up some words for you that sound like they could work. (That is literally how an LLM works.)
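To caricature how that works: a language model repeatedly samples the next token from a learned probability distribution over what might plausibly come next. A toy sketch in Python (a hand-built bigram table standing in for billions of parameters – nothing like a real LLM’s scale or tokenisation):

```python
import random

# Toy "model": next-word probabilities, hand-built for illustration.
# A real LLM learns a vastly bigger version of this table, but the
# principle -- sample the next token from a distribution -- is the same.
bigrams = {
    "the": {"answer": 0.5, "draft": 0.5},
    "answer": {"is": 1.0},
    "is": {"plausible": 0.7, "wrong": 0.3},
}

def generate(word: str, steps: int, rng: random.Random) -> list[str]:
    """Sample a chain of next words until the table runs out of options."""
    out = [word]
    for _ in range(steps):
        dist = bigrams.get(word)
        if not dist:
            break
        words, probs = zip(*dist.items())
        word = rng.choices(words, weights=probs)[0]
        out.append(word)
    return out

print(" ".join(generate("the", 3, random.Random(0))))
```

Nothing in that loop checks whether the output is true – only whether it is probable. That is the whole point about plausible-sounding drafts.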

On the other hand, I spent yesterday afternoon grappling with Microsoft Copilot as it gave me lots of credible sounding information… with sources that just don’t exist, or don’t say the things it says they do. That’s quite frightening because many people will just believe the made-up stuff, repeat it and say “I got it from Copilot/ChatGPT/insert tool of choice”.

Anyway, artificial intelligence (AI) is more than just GenAI – and last night I watched this video from Eric Siegel. Once all the hype about GenAI has died down, maybe we’ll find some better uses for other AI technologies like predictive AI. One thing is for sure, Artificial General Intelligence (AGI) is not coming any time soon…

The enshittification of technical support information

I apologise to anyone inadvertently offended by the title of this post, but if the BBC’s Technology Editor can use the word “enshittification” as an example of retrograde progress, then so will I. It was also the American Dialect Society’s “Digital Word of the Year” in 2023, so that seems fair.

You see, I’m writing this on my phone, as I wait for my PC to be rebuilt onto a new corporate setup (and as the US Election results come in, which have not lightened my mood).

One of the issues I had whilst preparing to wipe all my data was a OneNote export process which said:

“OneNote cannot pack these sections because some sections have not yet been fully synced and could not be downloaded. These pages will not be saved. Continue anyway? Yes | No”

Great. It’s probably some embedded media that no longer exists… but that message is not very helpful for identifying a problem. There are hundreds of pages of notes across about a hundred sections. Which pages won’t be saved?

Still, it could be worse. There are plenty of cutesy messages these days that say something along the lines of:

“Oops, something went wrong”.

Nice. What went wrong? Can you give me a clue please so I might be able to find a resolution to the issue?

A brief history of technical support resources

When I started working in IT, we didn’t have the World Wide Web. Admittedly, we didn’t have as much information at our fingertips either but IT support often required a subscription to a database of knowledge articles. These were created by the software vendor and were reliable. At ICL we had something called PC-PARIS (PARIS was the support knowledge base). Microsoft had TechNet subscriptions with support information and updated software. Both of these were distributed to subscribers in CD-ROM format. This was a time when a typical Internet connection ran at around 33Kbps.

Errors had codes. We could look up what they meant and identify the cause. Not now. What does “[your application] ran into a problem and had to stop” mean?

As the web took off, so did blogs. People like me posted answers to problems we found. Then big websites started SEOing their way to the top and user-submitted content was the way forwards. Some of that content was good. Some of it less so. And we’ve all seen that geek on an online forum who doesn’t answer the question but instead provides a terse comment as to why the thing being asked is totally wrong because why would anyone do it that way anyway?

AI FTW?

Now we have AI answering the questions. Google or Bing will insert an AI-generated response at the top of the search results and, sometimes it’s helpful. But it’s an LLM – it literally makes stuff up based on probability. Some of that stuff is just wrong.

I may just be a grumpy old man, but I really just want detailed problem reports, accurate analysis, and reliable search results. Because not all progress is good progress.

Featured image from Pixabay.

Think about the end-user experience

I recently marked 30 years working full-time in the IT industry. That’s a long time. When I started, we didn’t all have laptops (I shared a PC in the office), we had phones on desks, administrators to help with our… administration, and work was a place where we went as well as a thing that we did.

Over time, I’ve seen a lot of change: new systems, processes, ways of working. But right now is the biggest of them all. For the last nine-and-a-half years, all of my work has been stored in one Office 365 tenant. Now it’s being migrated to another, as part of some cleanup from a merger/acquisition that took place a while ago.

I’m just a normal end-user

I’m just a user. Albeit one with a technical background. And maybe that’s why I’m concerned. During the Covid-19 pandemic, I was issued a new laptop and everything was rebuilt using self-service. It went very well, but this is different. There is no going back. Once my laptop has been wiped and rebuilt into the new organisation, there is no “old machine” to go back to if I incorrectly synced my data.

Sure, I’ve done this before – but only when I’ve left one employer to go somewhere else. Never when I’ve been trying to continue work in the same place.

People change management

To be clear, the migration team has been great. This is not your typical internal IT project – this is being run properly. There are end-user communications, assigned tasks to complete to help make sure everything goes smoothly, FAQs, migration guides. ADKAR is in full flow. It all looks like it should go swimmingly. But 30 years of working in tech tells me to expect the unexpected (plus a tendency to be over-anxious and to catastrophise). And modern security practices mean that, if I were to make a copy of all my data on an external drive, “just in case”, I’d set all sorts of alarm bells ringing in the SOC.

I’ll have to roll with it.

The schedule

So, the technical issues are resolved – or at least put to one side. Next is the migration window. It runs for 2 weeks. But the second of those weeks is the school half-term holidays in a sizeable chunk of the UK. I, for one, will be away from work. I also have an assignment to complete by the end of the month, all the usual pre-holiday preparations squaring work away, and this is whilst I have two days taking part in an AI hackathon event and two days when I’m on call for questions in relation to our Microsoft Azure Expert MSP audit. “I’m sorry, I can’t find that information right now because I’m rebuilding my PC and migrating between Microsoft 365 tenants” isn’t going to go down well.

In short, there is no good time for my migration. And this is what it’s like for “real” end-users in our clients’ organisations. When they don’t want to clear down their email or delete old data it’s (generally) not because they are awkward (well, not always). They have a job to do, and we (IT) are doing something with one of the tools that they use to do that job. There’s uncertainty about how things will work after the migration and they need to allocate time. Time that they may not have.

Walking in someone else’s shoes

All too often, us IT folks just say “it’ll be fine”, without understanding the uncertainty that we impose on our customers – the users of the systems that we manage. Maybe it’s good for me to stand in their shoes, to be a typical business end-user, to understand what it’s like to be on the end of an IT project. Maybe we should all do it more often, and then we can run better projects.

Featured image by Kosta from Pixabay.