Why companies still want in-house data centres
Sometimes it seems as if the cloud is swallowing corporate computing. Last year businesses spent nearly $230bn globally on external (or “public”) cloud services, up from less than $100bn in 2019. Revenues of the industry’s three so-called “hyperscalers”, Amazon Web Services (AWS), Google Cloud Platform and Microsoft Azure, are growing by over 30% a year. The trio are beginning to offer clients newfangled artificial-intelligence (AI) tools, which big tech has the most resources to develop. The days of the humble on-premises company data centre are, surely, numbered.
Or are they? Though cloud budgets overtook in-house spending on data centres a few years ago, firms continue to invest in their own hardware and software. Last year these expenditures passed $100bn for the first time, reckons Synergy Research Group, a firm of analysts (see chart 1). Many industrial companies, in particular, are finding that on-premises computing has its advantages. A slug of the data generated by their increasingly connected factories and products, which Bain, a consultancy, expects soon to outgrow data from broadcast media or internet services (see chart 2), will stay on premises.
The public cloud’s convenience and cost savings, a product of its economies of scale, come with downsides. The hyperscalers’ data centres are often far away from the source of their customers’ data. Transferring these data from their source to where they are crunched, sometimes half a world away, and back again takes time. Often that does not matter; not all business information is time-sensitive to the millisecond. But sometimes it does.