More SharePoint shenanigans

This week, I ’ave mostly been working in SharePoint (those of a certain age may spot the reference to Jesse from The Fast Show).

Earlier this month, I wrote a post with a few hints and tips I’d picked up whilst developing a site based on SharePoint.  Since then, I’ve come up against a few more barriers…

Item-level permissions on document libraries

SharePoint allows administrators to set permissions on lists so that users can read and/or edit only their own items. Unfortunately, whilst that functionality is exposed in the user interface for lists, it’s not for document libraries (although it is in the object model). Basically, if you can write some code, you might be able to set the requisite permissions, but that’s beyond my abilities.  Thankfully, others have done the legwork, either in the form of a rough-and-ready utility like Matt Morse’s, or Tim Larson’s Read/Write Security solution.  Either way, I need to get the changes applied to our SharePoint farm… which may well be more trouble than it’s worth…

Changing the contact for a page

Even though there is a page contact in the properties for each page (at least there is in my company – that might just be metadata that we use…), changing that property doesn’t seem to update the contact shown for the page (which our templates display at the bottom of the page – and which also appears in a view of all pages).  The answer is to edit the page settings (in the same way as changing the page layout). Thanks to Ian Mitchell (@ianmitchell2) for setting me on the right path there…

A calculated column, with a formula based on a Yes/No field

I wanted to display a tick or a cross instead of a yes or a no in a view on one of my lists and, in order to do this, I needed to create a calculated column that produced the necessary HTML (and a script to display it…). I’ll write another post about the fancy formatting but I really struggled to work out how the Yes/No value is recorded (Yes/No? 1/0? TRUE/FALSE?). Exporting my list to Excel proved that SharePoint stores boolean values as TRUE/FALSE (confirmed by Peter Allen) but the trick is to leave out any quotes – if you test for =IF(Column="TRUE","This","That") the comparison will always be false and the outcome will be “That”.  The correct formula is =IF(Column=TRUE,"This","That").
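To see why the quotes matter, here’s a rough analogue in Python (just an illustration of the principle, not SharePoint itself): a boolean value never compares equal to the string "TRUE", so the quoted test can never match.

    # Python analogue of the SharePoint pitfall: a boolean is not a string,
    # so comparing against "TRUE" (with quotes) never matches
    stored_value = True               # SharePoint stores Yes/No as a boolean
    print(stored_value == "TRUE")     # False - the quoted formula's mistake
    print(stored_value == True)       # True  - the unquoted comparison works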

Editing in datasheet view but the view is read-only?

I needed to perform some bulk updates on lists in SharePoint but, frustratingly, the view was marked as Read Only so I couldn’t make any edits.  I couldn’t see why this was, but googling turned up an explanation – the list was set to require content approval.  PointBeyond has more information on configuring approval in SharePoint but temporarily removing this setting allowed me to make the necessary updates, before re-enabling it.

Connecting data in web parts

Back in around 2003/2004, I remember attending a SharePoint training course where I connected a couple of web parts to work together.  For someone like me, with little or no coding skills, this was magical… and then I forgot how to do it.  Yesterday, I ran up against an issue where, partly as a result of some database design decisions by the previous designer, I found myself unable to display the view on a list that I wanted, as there was an implied hierarchy in the data, but the lower levels in the hierarchy linked to their parents, rather than parents linking to children (it would be better still if things could work both ways…).

The workaround, albeit a clunky one, was to configure two web parts, each showing a view on a different list, and then configure a data connection so that the first list provided a row upon which to filter the second.  I needed to be careful in selecting columns (i.e. the second list needs a column that is a lookup on the first) but, with that in place, I was able to at least show the relationships between the items in each list.

Wake on LAN braindump

I lost quite a bit of sleep over the last few nights, burning the midnight oil trying to get my Dell PowerEdge 840 (a server repurposed as a workstation) to work with various Dell management utilities and enable Wake on LAN (WoL) functionality.

It seems that the various OpenManage tools were no help – indeed, many of the information sources I found for configuring the Baseboard Management Controller and kicking SOLProxy and IPMI into life seemed to be out of date, or just not applicable to Windows 7 (although ipmish.exe might be a useful tool if I ever get it working, as it can be used to send WoL packets). I did find that, annoyingly, WinRM 2.0 needs an HTTPS connection and that a self-signed certificate will not be acceptable (according to Microsoft knowledge base article 2019527).  If I ever return to the topic of WinRM and IPMI, there’s a useful MSDN article on installation and configuration for Windows Remote Management.

In the end, even though my system is running Windows 7, the answer was contained in a blog post about a PowerEdge 1750, WoL and Debian:

“Pressing ‘CTRL-S’ brings us to a configuration panel which allows for enabling the Wake-On-LAN (WOL) mode of the card.”

I’d been ignoring this because the Ctrl-S boot option advertises itself as the “Broadcom NetXtreme Ethernet Boot Agent” (and I didn’t want to set the machine up to PXE boot) but, sure enough, after changing the Pre-boot Wake On LAN setting to Enable, my PowerEdge 840 started responding to magic packets.
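Incidentally, there’s no great mystery to a magic packet – it’s just 6 bytes of 0xFF followed by the target NIC’s MAC address repeated 16 times, sent as a UDP broadcast. If you’d rather not download a utility, a minimal Python sketch like this should do the job (the MAC address shown is a placeholder – substitute your own):

    import socket

    def send_magic_packet(mac, broadcast="255.255.255.255", port=9):
        # Build the payload: 6 x 0xFF, then the MAC address repeated 16 times
        mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
        packet = b"\xff" * 6 + mac_bytes * 16
        # Send it as a UDP broadcast (port 9, the "discard" port, is conventional)
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(packet, (broadcast, port))
        s.close()

    send_magic_packet("00:11:22:33:44:55")  # placeholder MAC address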

On my WoL adventure, I’d picked up a few more hints/tips too, so I thought it was worth blogging them for anyone else looking to follow a similar path…

“Windows 2000 and Windows 2003 do not require that WOL be turned on in the NIC’s or LOM’s firmware, therefore the steps using DOS outlined in the Out-of-Box and Windows NT 4.0 procedures are not necessary and should be skipped.  Enabling WOL with IBAUTIL.EXE, UXDIAG.EXE or B57UDIAG.EXE may be detrimental to WOL under Windows 2000 and Windows 2003.”

  • Presumably this advice also applies to Windows XP, Vista, Server 2008, 7 and Server 2008 R2, as they are also based on the NT kernel, so there is no need to mess around with DOS images and floppy drives to try and configure the NIC…
  • I downloaded Broadcom’s own version (15.0.0.21, 19/10/2011) of the Windows drivers for my NIC (even though Windows said that the Microsoft-supplied drivers were current) and I’m pretty sure (although I can’t be certain) that the Broadcom driver exposed advanced NIC properties to control Wake Up Capabilities and WoL Speed that were not previously visible. (Incidentally, I left all three power management checkboxes selected, including “Only allow a magic packet to wake the computer”.) There’s more information on these options in the Broadcom Ethernet NIC FAQs.
  • There is a useful-sounding CLI utility called the Broadcom Advanced Control Suite that I didn’t need to download; however, its existence might be useful to others.
  • Depicus (Brian Slack) has some fantastic free utilities – and a host of information about WoL.
  • Other WoL tools are available too, although I think Depicus has the landscape pretty much covered.
  • There’s also some more information about WoL on Lifehacker.

Fixing a Dell server that required F1 on every boot

Last weekend, I dusted off (literally) my Dell PowerEdge 840, which was retired in favour of a low-power server a couple of years ago. My employer’s IT policies are making it harder and harder to do any personal computing from work (I know my laptop is for work, but there’s a big grey area between work and play these days) and, whilst the Mac Mini is fine for music, a bit of browsing and email, I wanted something a bit more “heavy duty” for some of my home computing needs.  With 8GB of RAM and a quad-core Xeon CPU, my old server is a pretty good workstation (7.0 on the Windows Experience Index for CPU and memory, 5.9 for the primary hard disk, but only 1.0 for graphics!) and so it’s been brought back into service as a Windows 7 PC.

Unfortunately, every time I booted it, I had to press F1, until I worked out that it was still looking for some hard disks that I had removed.  Delving into the BIOS and switching the spare SATA ports to Off, rather than Auto, sorted out the problem and now the system boots without issue.

Half-baked cookies…

I don’t know if this website uses cookies. I think it probably does because I have Google Adsense code and Google Analytics code in place. It wouldn’t surprise me if WordPress uses some cookies too but, like many bloggers, I use off-the-shelf software and, as long as it works, I don’t worry too much about how things happen.

Unfortunately, some half-baked EU directive about privacy and cookies (half-baked – get it…) takes effect this month, after even the UK government needed a year to get its act together (the Information Commissioner’s Office, which is responsible for enforcing the associated UK legislation, only removed its last cookie in March).

What’s worse is that the ICO’s guidance for website owners is really difficult to follow. Peter Bryant (@PJBryant) pointed me at an article in PC Pro magazine that suggests I should be OK without doing anything; meanwhile, Kuan Hon (@Kuan0) from the Cloud Legal Project at Queen Mary University suggested a few weeks ago that we all need to be looking carefully at our sites if we want to avoid a fine…

I’m no lawyer and I can’t afford to be paying fines, so I checked out some WordPress plugins that might help me. Some were linked to websites that were supposed to check my site for cookies… except they didn’t seem to work – and, anyway, I don’t really want to be making a big deal about cookies (they are, mostly, harmless).
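(For what it’s worth, you can get a rough idea of the server-set cookies yourself by looking at the Set-Cookie headers in an HTTP response – a quick Python sketch is below, with example.com as a placeholder URL. Note that this only catches cookies set in the response headers; anything set by JavaScript, such as Google Analytics, won’t show up – which may well be why those checker sites struggled.)

    import urllib.request

    def server_set_cookies(url):
        # Cookies set via HTTP response headers; cookies set by JavaScript
        # (e.g. Google Analytics) will not appear here
        response = urllib.request.urlopen(url)
        return response.headers.get_all("Set-Cookie") or []

    for cookie in server_set_cookies("https://example.com/"):
        print(cookie.split(";")[0])   # just the name=value pair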

I selected a very simple plug-in called Cookie Warning that presents a message (importantly, not a pop-up) to first-time site visitors. The message is customisable (although changing the size of the text on the buttons will involve me editing the plugin) and it seems to be enough for me to gain consent from users. Importantly, it doesn’t seem to impact the way in which search engines see the site.

Only time will tell if this change negatively impacts my traffic – I’d like to think that most of my visitors understand enough about cookies to realise that this is not really such a big deal – but it will be interesting to see how this pans out over the next few months as companies big and small update their sites to comply with the legislation.

A Microsoft view on the consumerisation of IT (#ukitcamp)

I never realised that my blog posts were feared. At least not until Microsoft’s Andrew Fryer (@deepfat) said he was less concerned about my feedback on yesterday’s IT Pro Camp event than about my blog post! Well, all I can promise is to try and be objective, fair and balanced – which is what readers have come to expect around here – even if there is less Microsoft-focused content these days.

I went along to yesterday’s IT Pro Camp on Consumerisation as a result of a Twitter conversation that suggested I come and see what Microsoft is doing to embrace and support consumerisation.  To be fair, I should have known better. For the last 20 years, Microsoft has provided desktop (and back-office) systems to enterprises, and the consumerisation megatrend threatens this hegemony. Sure, they also operate in the consumer space, but consumerisation is increasingly mobile and cross-platform, which means that Microsoft’s dominance is weakening*.

What the UK TechNet team has done is to put together a workshop that looks at how Microsoft tools can be used to support consumerisation in the enterprise – and, at that level, it worked well (although I’m pretty sure the event synopsis changed at some point between me booking my place and it actually taking place).  Even so, I was naive to expect anything more than marketing. Indeed, I nearly went home at lunchtime as it was starting to feel like a big System Center Configuration Manager pitch and there was very little discussion of what is really meant by the consumerisation of IT.

There is little doubt in my mind that the event provided a great demo to show off a host of functionality in Microsoft’s products (and, to be fair, there is an increasing amount of cross-platform support too) but, time and time again, I was the awkward so-and-so who asked how I would implement a feature (for example DirectAccess) in a cross-platform estate (e.g. for BYOD) and the answer was that it needs Windows.

So, earlier in the week I was slating Oracle for an event that basically said “buy more of our stuff” and this week… well, it’s just “stuff” from Redmond instead of (different) “stuff” from Redwood Shores, I guess.

Even so, there were some snippets within the product demos that I would like to call out – for example, Simon May (@simonster)’s assertion that:

“We need to be more permissive of what’s allowed on the network – it’s easier to give access to 80% most of time and concentrate on securing the 20%.”

In a nutshell, Simon is reinforcing the point I made earlier this month when I suggested that network access control was outdated and de-perimeterisation is the way forward (although Microsoft’s implementation of NAC – called Network Access Protection – did feature in a demonstration).  There was also a practical demonstration of how to segregate traffic so that the crown jewels are safe in a world of open access (using IPsec) and, although the Windows implementation is simpler through the use of Group Policy, this will at least work on other devices (Macs and Linux PCs at least – I’m not so sure about mobile clients).

Of course, hosted shared desktops (Remote Desktop Services) and virtual desktop infrastructure reared their ugly heads but it’s important to realise these are just tactical solutions – sticking plaster if you like – until we finally break free from a desktop-centric approach and truly embrace the App Internet, with data-centric policies to providing access.

There was no discussion of how to make the App Internet real (aside from App-V demos and SharePoint/System Center Configuration Manager application portals) but, then again, this was an IT Pro event and not one for developers – so maybe a discussion on application architecture was asking a little too much…

Other topics included protection of mobile devices, digital rights management, and federation, featuring a great analogy from Simon as he described claims-based authentication as being a bit like attempting to buy a drink in a bar and being asked to prove your age with a driving licence, which is trusted because the issuer (e.g. the DVLA in mainland Britain) has gone through rigorous checks.

Hopefully this post isn’t too critical – my feedback basically said that there is undoubtedly a lot of work that’s gone into creating the TechDays IT Pro Camps and for many people they will be valuable. Indeed, even for me (I haven’t been involved in Microsoft products, except as a user, for a couple of years now) it’s been a great refresher/update on some of the new technologies. But maybe IT architects have a different view? Or maybe it’s time for me to get more intimately involved in technology again?


* I don’t see Microsoft being pushed out any time soon – Windows still runs on a billion PCs worldwide and analysts haven’t given up hope on Windows Phone either – at least not based on an IDC event I attended recently.

Getting started with Raspberry Pi (#RasPi)

Much to my manager’s disgust (he has a programming background, whilst I’m an infrastructure guy “by trade” – although I did write code in my youth!), my Raspberry Pi arrived last week. Despite the botched launch, I still think this is one of the most exciting products we’ll see this year because, well, because it’s a fully functioning computer for around £25 (Model B) and that means the potential addressable market is enormous. Actually, that’s not quite right – the Pi is around £25 (plus VAT) and then you’ll need some peripherals – although they should be relatively easy to lay your hands on:

  • A micro-USB mobile phone charger (I use the one that came with my Nokia Lumia 800 but any 5V supply that can feed a micro-USB cable will do)
  • A USB keyboard
  • (Optionally) a mouse
  • (Optionally) some speakers
  • (Optionally) a USB hub (powered)
  • A wired network connection
  • An SD card
  • A display – but watch out, as the Raspberry Pi supports HDMI and composite out (RCA) – not VGA.

My monitors are mostly VGA (I have one that will take DVI) and my TV is far too old for HDMI (it’s a 14-year-old Sony Trinitron 32″ widescreen CRT!) so I set the Pi up to use the analogue composite connection to the TV.

Installing the operating system

I selected a Linux distro (the Raspberry Pi blog suggests that Fedora Remix is the recommended distro, as does the FAQ, although there is extensive discussion about whether to use Fedora or Debian; the Raspberry Pi quick start guide suggests that developers should use Debian and there are alternative downloads too). Eventually, I managed to install the Raspberry Pi Fedora Remix on my SD card (my Ubuntu machine recognised the SD card, but the Python version of the Fedora ARM Image Installer didn’t*; meanwhile, my work laptop installed an image on the SD card but it wouldn’t boot – I suspect that’s down to the disk encryption software we use; finally, I managed to run the Windows version of the Fedora ARM Image Installer on another Windows 7 PC).

Once I had an operating system installed, I booted and the RasPi picked up an IP address from my DHCP server, registered itself in DNS (raspi.domainname) and set to work expanding its disk to fill the 8GB SD card I’m using.

* Getting this installer to work involved installing the python-qt4 package in the Ubuntu Software Centre, then running ./fedora-arm-installer.
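(In case it helps to demystify things: as far as I can tell, all these image installers really do is copy the image byte-for-byte onto the card, much like dd on Linux. A minimal Python equivalent is sketched below – the image filename and device path are placeholders, and do be careful with the device path, as writing to the wrong one will destroy data.)

    import shutil

    # Raw-copy an OS image onto an SD card, 4MB at a time.
    # WARNING: /dev/sdX is a placeholder - writing to the wrong device destroys data!
    with open("raspberrypi-fedora-remix.img", "rb") as image, \
         open("/dev/sdX", "wb") as sd_card:
        shutil.copyfileobj(image, sd_card, length=4 * 1024 * 1024)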

Switching displays

Unfortunately, standard definition CRT TVs are no better at working with Raspberry Pis than they are with any other computer (games consoles excepted) – and why I thought this one would be any different is a mystery…

With only part of the display visible via composite out (and not exactly easy to read), I started to investigate options for using the HDMI port.  It turns out that HDMI to VGA conversion is too expensive, but an HDMI to DVI cable cost just £2.39 at Amazon (thanks to Chromatix, The EponymousBob and GrumpyOldGit on the Raspberry Pi forums for sharing this info). With the RasPi hooked up to my only digital monitor, everything was much easier, although I did have to plug the cable directly into the monitor and I’m now waiting for delivery of a DVI-I female to female gender changer so that it’s a bit easier to swap the monitor cable between my computing devices.

So, what’s it like to use then?

Did I mention that the Raspberry Pi is a fully functioning computer for around £25? Well then, what’s not to like? Sure, performance is not lightning fast – the Raspberry Pi FAQs suggest:

“… real world performance is something like a 300MHz Pentium 2, only with much, much swankier graphics”

but that’s plenty for a bit of surfing, email and teaching my kids to write code.

I am finding though that I’m struggling a little with my chosen distro. For example, I haven’t yet managed to install Scratch and it doesn’t seem to be one of the recognised packages so I may have to resort to compiling from source – hardly ideal for getting kids started with coding. For that reason, I might switch to Debian (I’m downloading it as I write) but for now I’ll continue to explore the options that the Fedora Remix provides.

I’m sure there will be more RasPi posts on this blog but if you’re one of the thousands waiting for yours to arrive, hopefully this post will help to prepare…

And once the educational models are available, I’ll be encouraging my sons’ school to buy a lab full of these instead of a load more netbooks running Windows XP…

Big data according to the Oracle

After many years of working mostly with Microsoft infrastructure products, the time came for me to increase my breadth of knowledge and, with that, comes the opportunity to take a look at what some of the other big players in our industry are up to.  Last year, I was invited to attend the Oracle UK User Group Conference where I had my first experience of the world of Oracle applications; and last week I was at the Oracle Big Data and Extreme Analytics Summit in Manchester, where Fujitsu was one of the sponsors (and an extract from one of my white papers was in the conference programme).

It was a full day of presentations and I’m not sure that reproducing all of the content here makes a lot of sense, so here’s an attempt to summarise it… although even a summary could be a long post…

Big data trends, techniques and opportunities

Tim Jennings (@tjennings) from Ovum set the scene and explained some of the ways in which big data has the potential to change the way in which we work as businesses, citizens and consumers (across a variety of sectors).

Summing up his excellent overview of big data trends, techniques and opportunities, Tim’s key messages were that:

  1. Big data is characterised by volume, variety and velocity [I’d add value to that list].
  2. Big data represents a change in the mentality of analytics, away from precise analysis of well-bound sources to rough-cut exploratory analysis of all the data that’s practical to aggregate.
  3. Enterprises should identify business cases for big data and the techniques and processes required to exploit them.
  4. Enterprises should review existing business intelligence architectures and methods and plan the evolution towards a broader platform capable of handling the big data lifecycle.

And he closed by saying that “If you don’t think that big data is relevant to your organisation, then you are almost certainly missing an opportunity that others will take.”

Some other points I picked up from Tim’s presentation:

  • Big data is not so much unstructured as variably-structured.
  • The mean size of an analytical data set is 3TB (growing but not that huge) – don’t think you need petabytes of data for big data tools and techniques to be relevant.
  • Social network analytics is probably the world’s largest (free) marketing focus group!

Big Data – Are You Ready?

Following the analyst introduction, the event moved on to the vendor pitch.  This was structured around a set of videos, which I’ve seen previously, in which a fictitious American organisation grapples with a big data challenge, using an over-sized actor (and an under-sized one) to make their point. I found these videos a little tedious the first time I saw them, and this was the second viewing for me.  For those who haven’t had the privilege, the videos are on YouTube and I’ve embedded the first one below (you can find the links in a post on Oracle’s Data Warehouse Insider blog).


The key points I picked up from this session were:

  • Oracle see big data as a process towards making better decisions based on four stages: decide, acquire, organise and analyse.
  • Oracle considers that there are three core technologies for big data: Oracle NoSQL, Hadoop, and R; brought together by Oracle Engineered Systems (AKA the “buy our stuff” pitch).

Cloudera

Had I been at the London event, I would have been extremely privileged to see Doug Cutting, Hadoop creator and now Chief Architect at Cloudera, speak about his work in this field.  Doug wasn’t available to speak at the Manchester event, so Oracle showed us a pre-recorded interview.

For those who aren’t familiar with Cloudera (I wasn’t), it’s effectively a packaged open source big data stack (based on Hadoop and related technologies), providing an enterprise solution with support.

The analogy given was that of a “big data operating system” with Cloudera doing for Hadoop what Red Hat does for Linux.

Perhaps the most pertinent of Doug Cutting’s comments was that we are at the beginning of a revolution in data processing, where people can afford to save data and use it to learn, to get a “higher resolution picture of what’s going on and use it to make more informed decisions”.

Capturing the asset – acquire and organise

After a short pitch from Infosys (who have a packaged data platform, although, personally, I’d be looking to the cloud…) and an especially cringeworthy spoof Lady Gaga video (JavaZone’s Lady Java), we moved on to enterprise NoSQL. In effect, Oracle has created a NoSQL database using the Berkeley DB key-value store and a Java driver (containing much of the logic to avoid single points of failure) that they claim offers a simple data model, scalability, high availability, transparent load balancing and simple administration.

Above all, Oracle’s view is that, because it’s provided and maintained by Oracle, there is a “single throat to choke”.  In effect, in the same way that we used to say no-one got fired for buying IBM, they are suggesting no-one gets fired for buying Oracle.

That may be true, but it’s my understanding that big data is fuelled by low-cost commodity hardware (infrastructure as a service) and open source software – and whilst Oracle may have a claim on the open source front, the low-cost commodity hardware angle is not one that sits well in the Oracle stable…

Through partnership with Cloudera (which leaves some wondering if that will last any longer than the Red Hat partnership did?), Oracle is positioning a Hadoop solution for their customer base:

“Oracle describe Cloudera as the Redhat for Hadoop, but also say they won’t develop their own release; they said that for Linux originally.”
Debra Lilley (@debralilley)

Despite (or maybe in spite of) the overview of HDFS and MapReduce, I’m still not sure how Cloudera sits alongside Oracle NoSQL, but their “big data appliance” includes both options. Now, when I used to install servers, appliances were typically 1U “pizza box” servers. Then they got virtualised – but now it seems they have grown to become whole racks (Oracle) or even whole containers (Microsoft).

Oracle’s view on big data is that we can:

  1. Acquire data with their Big Data Appliance.
  2. Organise/Analyse aggregated results with Exadata.
  3. Decide at “the speed of thought” with Exalytics.

That’s a lot of Oracle hardware and software…

In an attempt not to position Oracle’s more traditional products as old hat, the next presenter suggested that big data is complementary – not really about old and new, but about familiar and unfamiliar. Actually, I think he has a point: at some point “big” data just becomes “data” (and gets boring again?). This session gave an overview of an information architecture challenge as new classes of data (videos and images, documents, social data, machine-generated data, etc.) create a divide between transactional data and big data – which is not really unstructured, but better described as semi-structured – and which uses sandboxes to analyse and discover new meaning from data.

Oracle has big data connectors to integrate with other (Oracle) solutions, including: a HiveQL-based data integrator; a loader to move Hadoop data into Oracle 11g; a SQL-HDFS connector; and an R connector to run scripts with API access to both Hadoop and more traditional Oracle databases. There are also Oracle products such as GoldenGate to replicate data in heterogeneous data environments.

[My view, for what it’s worth, is that we shouldn’t be moving big data around, duplicating (or triplicating) data – we should be linking and indexing it to bridge the divide between the various silos of “big” data and “traditional” data.]

Finding the value – analyse and decide

Speaking of a race to gain insight, of analytics becoming the CIO’s top priority for 2013, and of business intelligence usage doubling by 2014, the next session looked at some business analytics techniques and characteristics, which can be summarised as:

  • I suspect something – a data scientist or analyst needs to find proof and turn it into a predictive model to deploy into a business process (classification).
  • I want to know if that matters – “I wish I knew” (visual exploration and discovery).
  • I want to make the best decision now – decisions at the speed of thought in the context of a business process.

This led on to a presentation about the rise of the data scientist and making maths cool (except it didn’t, especially with a demo of some not-very-attractive visualisations run on an outdated Windows XP platform) and an introduction to the R language for statistical analysis and visualisation.

Following this was a presentation about Oracle’s recently-acquired Endeca technology, which actually sounds pretty interesting as it digests a variety of data sources and creates a data model with an information-discovery front-end that promises “the simplicity of search plus the power of BI”.

The last presentation of this segment looked at Oracle’s Exalytics in-memory database servers (a competitor to SAP HANA), bundling business intelligence software, adaptive in-memory caching (and columnar compression) with information discovery tools.

Wrap-up

I learned a lot about Oracle’s view of big data but that’s exactly what it was – one vendor’s view on this massively hyped and expanding market segment. For me, the most useful session of the day was from Ovum’s Tim Jennings and if that was all I took away, it would have been worthwhile.

In fairness, it was good to learn some more about the Oracle solutions too, but I do wish vendors (including my own employer) would sometimes drop the blatant product marketing and consider the value of some vendor-agnostic thought leadership. I truly believe that, by showing customers a genuine understanding of their business, the issues that they face and the directions that business and technology are heading in, the solutions will sell themselves if they truly provide value. On the other hand, by telling me that Oracle has a complete, open and integrated solution for everything and what I really need is to buy more technology from the Oracle stack and… well, I’d better have a good story to convince the CFO that it’s worthwhile…

Slidedecks and other materials from the Oracle Big Data and Extreme Analytics Summit are available on the Oracle website.

A collection of SharePoint shortcuts

I spent most of last Friday developing a business system with Microsoft Office SharePoint Server (2007).  I’ve worked on a few SharePoint sites over the years and I’m impressed at how much can be done using just standard functionality (lists, etc.) but, whilst the platform is powerful and flexible in many ways, it’s also intensely infuriating at times.

In developing this latest site, there were a few things that I had to Google for – and I’m hoping that posting them here might help others…

Changing the page layout

I created a new page using one of the templates provided for me by my IT department.  Unfortunately, I found that the web part layout was a little too restrictive and I needed to change the page layout.  I hunted around for a while (even after a colleague had told me to look for the page settings) and then I found a post by Shane Young that helped me out. As Shane describes, the steps are:

  1. “Browse to the page
  2. Click Site Actions, Edit Page
  3. From the tool bar click Page
  4. In the drop down list click Page Settings
  5. Now pick your Page Layout
  6. Click OK”

With the new page layout in place I was able to get the page looking (almost) how I wanted.

Hiding the Title column from forms

My site is built around a document library with a number of columns. One of the default columns is called Title and it’s not that useful to me, as it just duplicates the Name field (doubling up the details that users need to enter for a document in the library). I can always hide the column from list views but I can’t delete it completely – and the field still appears in forms. Sometimes, I repurpose Title by changing the column name, but I can’t change the column type – it’s always a single line of text. Then I found John Owings’ post, which describes the steps to hide the Title column from forms:

  1. “From the list view click Settings [then] List Settings
  2. On the Settings Screen, under the ‘General Settings’ heading, click ‘Advanced Settings’
  3. On the Advanced Settings screen click ‘Yes’ for the value: ‘Allow Management of Content Types?’
  4. Click ‘OK’
  5. Now, back on the Settings Screen, under the ‘Content Types’ heading, click ‘Item’
  6. On the Content Type Management Screen, under the ‘Columns’ section, click on the ‘Title’ column
  7. On the next screen click the radio button for ‘Hidden (Will not appear in forms)’
  8. Click ‘OK’”

Internal anchors

Whilst I’m sure it’s possible to use inline CSS, my SharePoint pages resort to some awful HTML hacks at times, like using tables for layout (and then having to mess around with valign attributes and other such code that I haven’t used in about ten years…). I probably shouldn’t admit to such awful practices but I also had to relearn something I’d forgotten many years ago – the use of internal anchors within a page.

It’s worth noting, though, that using SharePoint’s Rich Text Editor to create a link to #anchor actually created a link to http://server.domain.tld/layouts/RTE2PUEditor.aspx#anchor. I had to explicitly include the full pathname (e.g. http://server.domain.tld/Pages/Page.aspx#anchor) in the link in order to avoid this behaviour.

Short takes: the rise of the personal cloud; what’s in an app; and some thoughts on Oracle

A few highlights from my week that didn’t grow into blog posts of their own…

Oracle: complete, open and integrated

I was at an Oracle event earlier this week when I heard the following comment that amused me somewhat (non-attributed to protect the not-so-innocent):

“Oracle likes to say they are complete, open and integrated – and they are:

  • Complete – as in ‘we have a long price list’.
  • Open – as in ‘we’re not arrogant enough to think you buy everything from us’.
  • Integrated – as in ‘if we own both sides of the connection we’ll sell you the integration’.”

I don’t have enough experience of working with Oracle to know how true that is, but it certainly fits the impression I have of the company… I wonder what the Microsoft equivalent would be…

The rise of the Personal Cloud

I’ve been catching up with reading the paper copy of Computing that arrives every fortnight (and yes, I do prefer the dead tree edition – I wouldn’t generally read the online content without it). One of the main features on 22 March was about the rise of the personal cloud – a contentious topic among some, but one to ignore at your peril, as I highlighted in a recent post.

One quote I particularly liked from the article was this one:

“The personal cloud isn’t so much the death of the PC as its demotion. The PC has become just another item in a growing arsenal of access devices.”

Now all we need is for a few more IT departments to wake up to this, and architect their enterprise to deliver device-agnostic services…

What’s app?

In another Computing article, I was reading about some of the technologies that Barclays is implementing and it was interesting to read COO Shaygan Kheradpir’s view on apps:

“Many […] tasks that happen on the front-line are […] app-oriented […].

And what are apps? They are deep and narrow. They’re not like PC applications, which are broad and shallow. You want apps to do one, often complex, task.”

Sounds like Unix to me! (but also pretty much nails why mobile apps work so well in small packages.)

Network access control does its job – but is a dirty network such a bad thing?

Earlier this week, I was dumped from my email and intranet access (mid-database update) as my employer’s VPN and endpoint protection conspired against me. It was several hours before I was finally back on the corporate network; meanwhile, I could happily access services on the Internet (my personal cloud) and even corporate email using my mobile phone.

Of course, even IT service companies struggle with their infrastructure from time to time (and I should stress that this is a personal blog – my comments are my own and not endorsed by my employer) but it raises a real issue: for years, companies have defended their perimeters and built up defence-in-depth strategies with rings of security. Perhaps that approach is less valid as end users (consumers) become increasingly mobile, and what we really need to do is look at the controls on our data and applications – perhaps a “dirty” network is not such a bad thing if the core services (datacentres, etc.) are adequately secured?

I’m not writing this to “out” my employer’s IT – generally it meets my needs and it’s important to note that I could still go into an office, or pick up email on my phone – but I’d be interested to hear the views of those who work in other organisations – especially as I intend to write a white paper on the subject…

In effect, with a “dirty” corporate network, the perimeter moves from the edge of the organisation to its core, and office networks become no more secure than the Wi-Fi access provided to guests today – at the same time as many services move to the cloud. Indeed, why not go the whole way and switch from dedicated WAN links to using the public Internet (with adequate controls to encrypt payloads and to ensure continuity of service, of course)? And surely there’s no need for a VPN when the applications are all provided as web services?

I’m not suggesting it’s a quick fix – but maybe something for many IT departments to consider in adapting to meet the demands of the “four forces of IT industry transformation”: cloud; mobility; big data/analytics and social business?

[Update: Neil Cockerham (@ncockerh) reminded me of the term “de-perimeterisation” – and Ross Dawson (@rossdawson)’s post on tearing down the walls: the future of enterprise tech is exactly what I’m talking about…]