The Cloud Strategy Dinner

Posted by Kate Stevens and Adam Stead | 28-Jun-2018 09:39:47
The Cloud Strategy Dinner took place on 26 June at Searcys at the Gherkin. It was held in partnership with GTT, with the aim of exploring the future of cloud technology.

GTT is an enterprise communications company focused on cloud. 

At the dinner, members asked:

  • What is the future for cloud? Are multi-cloud architectures inevitable?
  • What are the principal challenges in delivering a successful cloud-first strategy?
  • How can geographical challenges be overcome without compromising network stability?
  • What are the critical components in enabling cutting-edge digital initiatives to flourish?

Setting the Table for Success

In business technology, it’s sometimes useful to distinguish between “facilitators” and “success-makers”. A success-maker represents success in and of itself: if you build an excellent optimisation process, that’s its own reward, the ROI is intrinsic, and you can judge how well the product is working on day one. A facilitator is different: doing it badly precludes success, but doing it well does not guarantee it. Cloud belongs in the second category.

Obviously, this is a simplification: every piece of technology a customer touches sits somewhere between the two categories. But the distinction is still useful. For a baby boomer CEO, it may be easier to understand a technical product as an investment than on its own terms; and when the “return” does not arrive immediately, it’s essential to communicate why.

For older stakeholders unfamiliar with cloud, “old-fashioned thinking” is often characterised by phobia. One common source of distaste is the short-term cost of “parallel running”, a necessary stage of most cloud migrations in which data is held simultaneously in legacy and cloud storage, and is therefore more expensive to store. Another is discomfort with renting storage when it is possible to own it.
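
To make the short-term cost concrete: parallel running is essentially a dual-write pattern, in which every write lands in both stores (hence the doubled storage bill) while reads stay on the legacy system until the cloud copy is verified. A minimal sketch, where legacy_store and cloud_store are hypothetical stand-ins for real storage clients:

```python
# A minimal sketch of "parallel running" during a cloud migration.
# legacy_store and cloud_store are hypothetical stand-ins for real clients,
# each assumed to expose simple put/get methods.
class ParallelRunStore:
    def __init__(self, legacy_store, cloud_store):
        self.legacy = legacy_store
        self.cloud = cloud_store
        self.cutover_complete = False  # flip once the cloud copy is verified

    def write(self, key, value):
        # Dual-write: both copies are kept, which is why storage costs
        # roughly double for the duration of the migration.
        self.legacy.put(key, value)
        self.cloud.put(key, value)

    def read(self, key):
        # Serve from legacy until cutover, so users never see a partial copy.
        source = self.cloud if self.cutover_complete else self.legacy
        return source.get(key)
```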

It’s essential to communicate that cloud use is now table stakes for digital success, and that cloud migration will complement the data strategy a company already has. Three factors for success work together: good data, a good place to keep the data, and a good place to process the data. Cloud provides the latter two, and both are essential.

The Resident Data Scientist

There may be a bottleneck in building out a data science team to make the best use of your cloud data: the data scientists themselves, or rather the lack of them. Short supply has made good data scientists very dear.

But machine learning technology is, in effect, a data scientist. The principal difficulty here is one of communication and framing: ML can pull a lever, register that the outcome is positive, and continue pulling that lever.

But ML has no sense of irony: if the reward is misaligned with the desired outcome, it will keep pushing for the reward. For example, there have been systems which measured “engagement” in terms of clicks, and which optimised for “aggravated” clicks by making the experience actively annoying. That “common sense” element is more important in the data scientist than the data science itself.
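
As an illustration of that misalignment, consider a sketch of a simple epsilon-greedy optimiser rewarded per click (the page variants, click rates, and satisfaction figures below are invented for the example). Nothing in its reward signal distinguishes a happy click from an aggravated one, so it converges on the most annoying variant:

```python
import random

# Hypothetical page variants: measured click rate vs. actual satisfaction.
# The intrusive popup earns the most clicks -- many of them accidental or
# "aggravated" -- while delivering the worst experience.
VARIANTS = {
    "clean_layout":    {"click_rate": 0.10, "satisfaction": 0.9},
    "big_banner":      {"click_rate": 0.15, "satisfaction": 0.5},
    "intrusive_popup": {"click_rate": 0.30, "satisfaction": 0.1},
}

def epsilon_greedy(trials=10_000, epsilon=0.1):
    """Maximise the measured reward (clicks), oblivious to satisfaction."""
    counts = {v: 0 for v in VARIANTS}
    clicks = {v: 0 for v in VARIANTS}
    for _ in range(trials):
        if random.random() < epsilon or not all(counts.values()):
            arm = random.choice(list(VARIANTS))  # explore
        else:
            # Exploit: pick the variant with the best observed click rate.
            arm = max(VARIANTS, key=lambda v: clicks[v] / counts[v])
        counts[arm] += 1
        if random.random() < VARIANTS[arm]["click_rate"]:
            clicks[arm] += 1
    return max(counts, key=counts.get)  # the variant the optimiser settled on

winner = epsilon_greedy()
print(winner)                            # almost always intrusive_popup
print(VARIANTS[winner]["satisfaction"])  # ...which has the worst experience
```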

In the quest to understand customers, then, there are trade-offs in whether humans or bots parse their data.

To Buy or to Build

Is it better to buy cloud or build your own?

On one hand, why would you bother to build a cloud? Do you have the same technical capability as Amazon? If the answer is (a dignified and acceptable) “of course not”, then to some extent the question has been answered. The data will be more secure with them, and it will be cheaper.

Furthermore, building requires greater technical capability than renting: not just enough to maintain a viable cloud, but enough to make it superior to the vendors’. You’d be beating them at their own game: for many vendors, cloud is their sole function, and they do it well.

Public cloud is also becoming better value: vendors in that space are finding their market niches, and each is developing specific tools iterated on cloud data, encompassing everything from machine learning tools to “platforms as a service”. These are very valuable, and it’s becoming more and more common for pricing to centre solely on what you use as a buyer.

However, the debate is not quite so black and white. Yes, it’s easier to buy; but in a sense, it’s also easier not to go out and compete for business in the first place. When several companies buy similar products from the same cloud vendors, platforms as a service for example, they risk homogenising the product they ultimately offer their own customers.

Across all businesses and service lines, “customer experience” is increasingly digital: the product is the technology. If you differentiate yourself on technology, you must have technology which is different.

Perhaps the real question is: how digital is your business? If your business occupies a niche where the digital experience doesn’t matter so much, buying is probably more appropriate. If you compete solely on the basis of your digital experience, a bespoke cloud might be justified. In practice, either approach is likely to end in a compromise between homemade and bought-in cloud.

Once you have assessed your company’s market position, it’s essential to assess each prospective cloud, including your own, against three variables: performance, agility, and cost.
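
One way to make that assessment concrete is a simple weighted scorecard; the candidates, scores, and weights below are hypothetical placeholders rather than benchmarks:

```python
# A minimal sketch of comparing candidate clouds (your own included) on the
# three variables from the discussion. All numbers are hypothetical.
WEIGHTS = {"performance": 0.4, "agility": 0.3, "cost": 0.3}

candidates = {
    "public_vendor_a": {"performance": 8, "agility": 9, "cost": 7},
    "public_vendor_b": {"performance": 7, "agility": 8, "cost": 8},
    "private_build":   {"performance": 9, "agility": 5, "cost": 4},
}

def weighted_score(scores):
    """Collapse the three variables into one comparable number."""
    return sum(WEIGHTS[k] * v for k, v in scores.items())

# Rank the options; the weights are where market position comes in.
for name in sorted(candidates, key=lambda n: weighted_score(candidates[n]),
                   reverse=True):
    print(f"{name}: {weighted_score(candidates[name]):.1f}")
```

The weights are the interesting part: a business competing on digital experience would weight performance and agility heavily, while a business in a less digital niche might let cost dominate.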

Geography

There’s an idea, in swathes of the tech community, that data is ungovernable (except, perhaps, by blockchain). But to a hammer, everything looks like a nail: governments are still going to try.

There’s already a confusing legal soup around the geography of data. Where data is stored, where it is made available, and sometimes where the data subject is from, or where they are clicking from, are all grounds on which laws can attach to data. Multiply that by the number of countries with data laws, and compliance becomes extremely difficult.

But speed is important. Where possible, data should be kept within the country from which it is being accessed. Any look at abandonment rates on a website will tell you: reducing friction is paramount to success, and speed is a massive component of friction. Reducing distance reduces latency.

It’s also possible to “multizone” data, replicating it across regions. This makes compliance more of a headache (to take one example, under GDPR there are more copies to delete simultaneously at a customer’s request) but can improve speed without compromising in other areas.
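
To take the GDPR example a little further: with multizoned data, a single erasure request fans out to every replica, and the request isn’t fulfilled until every zone confirms. A minimal sketch, where the zone stores and their delete_customer method are hypothetical stand-ins:

```python
# A minimal sketch of GDPR erasure across multizoned data. The zone stores
# and their delete_customer method are hypothetical stand-ins.
def erase_everywhere(zones, customer_id):
    """Attempt deletion in every zone; return the zones that failed."""
    failed = []
    for name, store in zones.items():
        if not store.delete_customer(customer_id):
            failed.append(name)
    return failed

# Usage: keep retrying (or escalate) until no zone still holds the data --
# every extra copy kept for speed is an extra copy to delete on request.
# failures = erase_everywhere({"eu-west": eu_store, "us-east": us_store}, "c-123")
```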
