Short takes: SSH, custom ports, root and Synology NASs


This blog has been much neglected of late… I'd like to get more time to write and I have literally hundreds of part-written posts, some of which are now just a collection of links for me to unpick…

In the meantime, a couple of snippets that may be useless, or may help someone one day…

Using SSH with a custom port number

My Synology NAS complains about poor security if I leave SSH enabled on port 22. It's fine if I change it to another port though (security by obscurity!). Connecting then needs a bit more work, as the command becomes ssh user@ipaddress -p portnumber (found via the Ask Ubuntu forums).
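
For anyone who'd rather not type the port every time, it can also be stored in the SSH client configuration. Here's a minimal sketch (the host alias, address and port are placeholders, not my real details):

# ~/.ssh/config – remembers the address, user and custom port for the NAS
Host nas
    HostName 192.168.1.10
    User admin
    Port 2222

After that, a plain ssh nas connects without needing the -p switch.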

Logging on to a Synology NAS from SSH as root

On a related topic, I recently needed to SSH to my NAS as root (not admin). ssh root@ipaddress -p portnumber wasn't authenticating correctly; then I found Synology's advice on how to log in to DSM with root permission via SSH/Telnet. It seems I have to first log on as admin, then sudo -i to elevate to root.
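
In practice, the session looks something like this (a sketch with placeholder address and port; DSM refuses direct root logons over SSH by default):

# Log on as admin first…
ssh admin@192.168.1.10 -p 2222
# …then elevate to a root shell (re-entering the admin password when prompted)
sudo -i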

Synology Hyper Backup and DSM update failures


I have a Synology DS916+ NAS and, for the last 9 months or so, I've been using it to back up my photos to Microsoft Azure. I've realised that they are being backed up in a format that's unique to Synology's Hyper Backup program, so I should probably see if there is an alternative that backs up the files in their native format. More worryingly, this afternoon I noticed that backups had been failing for a few days. The logs weren't much help (no detailed information) and a search on the 'net didn't turn up much either. For reference, this was the (very high-level) information in the logs when viewed in the Hyper Backup GUI:

Information 2017/07/08 03:00:02 SYSTEM [Azure Blob] [Backup Photos to Microsoft Azure] Backup task started.
Error 2017/07/08 03:00:33 SYSTEM [Azure Blob] [Backup Photos to Microsoft Azure] Exception occured while backing up data.
Error 2017/07/08 03:00:36 SYSTEM [Azure Blob] [Backup Photos to Microsoft Azure] Failed to backup data.
Error 2017/07/08 03:00:36 SYSTEM [Azure Blob] [Backup Photos to Microsoft Azure] Failed to run backup task.

(Since then, I’ve found how to view detailed backup logs on a Synology NAS, thanks to a blog post by Jonathan Mumm, though in this case, the logs didn’t shine much of a light on the problem for me.)
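
For what it's worth, the general approach is to open an SSH session to the NAS, elevate to root with sudo -i, and then filter the system log for backup-related entries. This is a hedged sketch rather than Jonathan's exact method – the log location may vary between DSM versions:

# Hyper Backup writes its detailed messages to the system log on the NAS
grep -i backup /var/log/messages | tail -n 50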

I wondered if there were any DSM updates available that might fix things but, when I checked for updates, I got a message to say "Insufficient capacity for update. The system partition requires at least 400MB". Googling suggested lots of manual file deletion, but I was sure this was just a build-up of temporary files (maybe to do with the failed backups), so I decided to reboot. After all, what do you do when a computer isn't working as expected? Turn it off and on again!
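
Incidentally, the free space on the system partition can be checked from an SSH session before resorting to a reboot. A quick sketch (on my DS916+ the system partition is mounted at the root of the filesystem, though that may differ on other models):

# Check free space on the DSM system partition
df -h /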

After rebooting, checking for updates no longer produced an error (it simply confirmed that I'm up to date with DSM 6.1.2-15132) and the backup is now running nicely. It will take a few hours to complete, as I added a few months' worth of iPhone photos to the NAS earlier in the week, at around the time the backups started failing…

VPN, DirectAccess or Windows 10 auto-trigger VPN profile?


On a recent consulting gig, I found myself advising a customer who was keen to deploy Microsoft DirectAccess (DA) in place of their legacy virtual private network (VPN) solution. As a DirectAccess user (having used Cisco AnyConnect VPN at my last place of work), I have to say the convenience of being always connected to the company network, without any interaction on my part, is awesome. I'm sure the IT guys like that they can always access my PC for management purposes too…

The trouble with DirectAccess is that it doesn't seem to have a published roadmap. So, should I really be advising my customers to use a technology that no longer appears to be under active development? First of all, I should add that it hasn't been deprecated. DirectAccess is still a supported feature in Windows Server 2016 (it's part of the Remote Access server role) – so it's still got a future. Annoyingly, it's not a supported workload on Azure (leading to on-premises deployments) but we can't have everything…

Now for the question of whether to use DA or a traditional VPN. Well, Microsoft MVP Richard Hicks (@RichardHicks) has written a fantastic blog post that goes through this in detail. Rather than paraphrasing, I’ll suggest that you go and read Richard’s post on DirectAccess vs. VPN.

But that's not the whole picture… you see, Windows 10 has a new auto-triggered VPN profile capability that I'm sure will, in time, replace DirectAccess. So, where does that fit in?

I put the question to Richard and got a great response; then my colleague Steve Harwood (@steveeh) joined in, advising that Auto VPN still requires a VPN profile and supporting infrastructure, but is initiated by a Universal Windows Platform (UWP) or desktop app starting or stopping; DirectAccess, meanwhile, has the added benefit that being always-on avoids the need to expose management and compliance systems publicly.

Actually, it gets a bit better with the Windows 10 Anniversary Update (Redstone 1/1607), which has the Always On VPN profile option, but we're still Windows-only at this point. For Windows, macOS, iOS and Android clients, Richard recommends a third-party DirectAccess alternative.

So, if the question is "should you deploy DirectAccess?", the answer is "maybe". It's a Windows Enterprise-only solution but, if you have other client platforms in your enterprise, you might want to consider alternatives instead of, or alongside, DA.

Auto-responder for blog marketing requests…


Having a popular blog is great. Mine’s probably not as popular as it once was – mostly that’s because I don’t get the time to write all the content that I would like to – but there are still more than 2000 posts here, so I do see a reasonable volume of traffic.

Unfortunately, that also means I get a lot of emails (sometimes several a day) asking me to add a link, feature some content, or something else – much of which is clearly scripted bulk email. And not replying only results in multiple chaser emails… so I'm fighting back with my own scripted response (an idea I got from a journalist who published advice for PR teams to help them pitch only the items he'd be interested in…):

“Hi,

You’re receiving this email because you recently emailed about the website at markwilson.it/markwilson.co.uk. Thanks for getting in touch; however, I receive several emails each day that take a lot of time to respond to (or multiple chaser emails if I don’t respond) so please don’t be offended by this automatic reply.

  • If you’re looking to place ads on my site, please don’t ask me what I would charge. Instead, please make me an offer. I don’t really know what the market rates are but you probably do. Please also include details of the page you’d like to advertise on, the landing page you would like and the period you would like to advertise for. I’ll only advertise sites that I think will be relevant to my readers so please don’t be offended if I don’t reply.
  • If you have a great resource that you’re sure would improve my content, please consider that markwilson.it is a blog. I’m not going to go back and edit posts from months or years past but you could always leave a comment on a post instead, as long as it’s genuine and not just spam.
  • If you’re offering to create content, please note that the content on the site is all written by me or by one of a very small number of trusted colleagues or family. I do not feature content written by others to promote their goods and services. If you’re starting out as a writer, I wish you well but would politely suggest you write on a public platform – or maybe start your own blog.

Thanks for your understanding.

Mark”

It’ll probably make no difference at all… but at least I can legitimately ignore repeated requests that haven’t acted on my reply…

Removing the residue left behind by stickers on a laptop


Yesterday, I was at an event where, during a discussion about developers becoming evangelistic over their various technology choices, another delegate referred to the "stickers and t-shirt" brigade. That made me laugh (and he was joking), as my Surface Pro wears quite a few stickers (though I struggle to find good ones for Microsoft products…) and only the night before I'd been removing one that was dragging down the overall tone.

After removing a large sticker that was starting to look a bit scruffy (using a plastic spatula to try and prise it away in pieces), I was left with a lot of sticky mess. Inspired by a WikiHow article on removing stickers from a laptop, I first used some cooking oil, then some window-cleaning fluid and finally a baby wipe to remove the glue, leaving the surface clean.

These may sound like strange materials for removing stickers but I didn’t want to risk anything stronger as it might damage the paint on the device (which isn’t actually mine – it’s my work PC). The end result is some pristine new real estate for a new batch of stickers (maybe there will be some at the Azure Red Shirt Developer event tomorrow…).

Facebook’s Restricted list


Imagine the situation: a family member befriends you on Facebook and you foolishly accept, then find their replies to your posts inappropriate or annoying… after all, you can choose your friends but not your family, right?

Well, it turns out that Facebook has a feature for situations like this – the Restricted list.

“Putting someone on the Restricted list means that you’re still friends, but that you only share your posts with them when you choose Public as the audience, or when you tag them in the post.

For example, if you're friends with your boss and you put them on your Restricted list, then post a photo and choose Friends as the audience, you aren't sharing that photo with your boss, or anyone else on your Restricted list. However, if you tag your boss in the photo, or choose Public as the audience, they'll be able to see the photo."

May be useful to know…

The Windows Network Connectivity Status Indicator (NCSI)


Last night, whilst working in the Premier Inn close to the office, I noticed the browser going to an interesting URI after I connected to the hotel Wi-Fi. That URI was http://www.msftconnecttest.com/redirect and a little more research tells me it's used by Windows 10 to detect whether the PC has an Internet connection or not.

The feature is actually the Network Connectivity Status Indicator (NCSI) and, more accurately, the URIs used for the connectivity probes are:

  • http://www.msftconnecttest.com/connecttest.txt (Windows 10)
  • http://www.msftncsi.com/ncsi.txt (earlier versions of Windows)

The URI I saw actually redirects to MSN whereas the ones above return static text to indicate a successful connection.
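
This is easy to check from a terminal. A quick sketch, assuming the endpoints still behave as they did when I tested:

# The probe URIs return short, fixed plain-text responses
curl http://www.msftconnecttest.com/connecttest.txt   # "Microsoft Connect Test"
curl http://www.msftncsi.com/ncsi.txt                 # "Microsoft NCSI"
# …whereas the redirect URI sends browsers on to MSN
curl -sI http://www.msftconnecttest.com/redirect | grep -i location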

For those who want to know more, there's a detailed technical reference on TechNet, which dates back to Windows Vista, and an extensive blog post on the Network Connectivity Status Indicator.

Do we need another as-a-service to describe functions?


Last week saw quarterly earnings reports from the major cloud vendors, and a tweet comparing their cloud revenues caught my eye.

You see, despite Azure growing by 93%, the headline numbers suggest that Amazon has the cloud market sewn up. Except I'm not sure they do…

I think it would be interesting to see this separated into infrastructure-, platform- and software-as-a-service (IaaS/PaaS/SaaS). I suggest that would present three very different stories. And I’d expect that Amazon would only really be way out front for IaaS.

My friend and former colleague, Garry Martin (@GarryMartin) questioned the relevance of those “legacy” distinctions but I think they still have value today.

In the early days of what we now recognise as cloud computing, every vendor was applying their own brand of cloud-washing. It still happens today, with vendors claiming to offer IaaS when really they have a hosted service and a traditional delivery model.

Back in 2011, the US National Institute of Standards and Technology (NIST) defined cloud computing, including the service models of IaaS, PaaS and SaaS. Those service models, along with the (also abused) deployment models (public cloud, private cloud, etc.), have served us well – but are they really legacy?

I don't think they are. Six years is a long time in IT, let alone in the cloud, but I think IaaS, PaaS and SaaS are as relevant today as they were when NIST wrote its definition.

When asked how “serverless” technologies like AWS Lambda, Azure Functions or Google Cloud Functions fit in, I say they’re just PaaS. Done right.

Some people want to add another service model/definition for Function-as-a-Service (FaaS). But why? What value does it add? Functions are just PaaS, but we've finally evolved to a place where we are moving past the point of caring about what the code runs on and letting the cloud manage that for us. That's what PaaS was supposed to have been doing for years (after all, should I really need to define the number of instances to run my web application? That all sounds a bit like virtual machines to me…)

To my mind, “serverless” is just the ultimate platform as a service and we really don’t need another service model to describe it.

To quote a haiku from Onsi Fakhouri (@onsijoe):

“Here is my code
Run it in the cloud for me
I don’t care how”

Or, as Simon Wardley (@swardley) made the same point by "fixing" a Cloud Foundry diagram.

Outlook gotcha: only cached data is exported to data file (.PST)


This weekend, a family project that required its own mailbox ended, meaning I could reduce the number of licences in my Exchange Online subscription. That’s straightforward enough but I wanted to take a backup copy of the email before cutting the mailbox loose.

From the last time I did any Exchange Online administration, I recalled that one of the limitations was that you can't back up a mailbox to a PST file from PowerShell. That may have changed but the advice at the time was to back up to an Outlook Data File (also known as a Personal Folders file) in Outlook. It's clunky but at least it's functional.

At first, I couldn't work out why not all of the data was being exported: only the items that were cached locally, and not the ones that appeared if I clicked on "There are more items in this folder on the server/click here to view more on Microsoft Exchange". Then I found a clue in a Spiceworks post from Joe Fenninger, where Joe says "Dont [sic] forget to download all [Office 365] content prior to export."

I needed to adjust the Cached Exchange Mode settings for the mailbox to keep all email offline (in Outlook 2016: File > Account Settings > Account Settings…, double-click the account, then drag the "Mail to keep offline" slider to "All"), after which Outlook could export all items to the Outlook Data File, rather than just the ones that were cached locally.

Securing the modern productive enterprise with Microsoft technology


“Cybercrime costs projected to reach $2 trillion by 2019” [Forbes, 2016]

99: The median number of days that attackers reside within a victim’s network before detection [Mandiant/FireEye M-Trends Report, 2017]

“More than 63% of all network intrusions are due to compromised user credentials” [Microsoft]

The effects of cybercrime are tremendous, impacting a company's financial standing, its reputation and, ultimately, its ability to provide secure employment for its staff. Nevertheless, organisations can protect themselves: the risks can be mitigated by applying a combination of people, process and technology to reduce the chances of a successful attack.

Fellow risual architect Tim Siddle (@tim_siddle) and I have published a white paper that looks at how Microsoft technology can be used to secure the modern productive enterprise. The tools we describe are part of Office 365, Enterprise Mobility + Security, or enterprise editions of Windows 10. Together they can replace many point solutions and provide a holistic view, drawing on Microsoft's massive Intelligent Security Graph.

Read more in the white paper:

Securing the modern productive enterprise with Microsoft technology