With employees encouraged to work from home and many citizens subject to some form of lockdown throughout 2020, technology and IT have taken on the importance of a utility along with water, gas and electricity. Without connectivity, telecommunications and the cloud, many businesses would have ground to a halt, while consumers would have had limited means of keeping in touch, buying essential goods, and staying entertained. But thanks to IT, businesses such as supermarkets, broadcasters and financial services have continued to thrive.
Online shopping has become the norm, particularly for essential goods, with online traffic in the supermarket sector growing by over a third globally. Subscriptions to on-demand streaming platforms are through the roof, with Netflix adding 26 million subscribers by the end of June. Social media, video communication and instant messaging platforms have all extended their user bases. The outstanding beneficiary here has been Zoom, which posted a 335% increase in revenues compared to 2019. Everything from the way we receive medical advice to how we check our bank balances and exercise is driven by a digital-first approach.
So, what is the impact of IT going from being in the background to being a core part of the backbone? For the technology industry, the impact is profound. There has been a collective and widespread epiphany about IT's value to both the economy and society. While this is music to the ears of cloud, connectivity and Software-as-a-Service (SaaS) providers, with great power comes great responsibility.
No more downtime
If IT is to live up to this critical infrastructure status, availability must be a given. Think about how rarely power cuts or empty water taps actually occur: they are infrequent events that still cause surprise and make headline news. Can we honestly say the same about the availability of IT services? Think about how often routers need rebooting and applications fail to respond to basic commands. Furthermore, cyber-breaches occur on a daily basis, with some statistics suggesting around 30,000 websites are hacked every day. For technology to be elevated to utility status, there needs to be an agreed level of service to which providers are held accountable by independent regulators. Simply put, 'this page cannot be displayed' and 'computer says no' moments have to become a thing of the past. While such a scenario may, in principle, seem unattractive to technology giants, this expectation befits the vital role technology plays in almost every aspect of our lives today.
Beyond the possibility of opposition from Silicon Valley, there are other challenges to consider when regulating technology. Take social media and search: enforcing a level of service for something the consumer does not pay to use would be an almost unprecedented move. However, subscription-based SaaS models lend themselves well to such regulation. Arguably, that regulation already exists in the form of the Service Level Agreement (SLA). These are set by the service provider, which is legally obliged to fulfil the SLA once a contract with a customer or partner has been signed. Given the impact of downtime on businesses, we are already seeing customers demand more of their providers.
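To make the idea of an enforceable service level concrete, here is a minimal sketch (in Python, using hypothetical availability tiers that are not drawn from this article) of how an SLA's availability percentage translates into the downtime it actually permits:

```python
# Illustrative only: convert an SLA availability percentage into the
# maximum downtime it permits. The tiers below are hypothetical examples
# of commonly quoted availability levels, not figures from this article.

HOURS_PER_YEAR = 365 * 24

def allowed_downtime_hours(availability_pct: float) -> float:
    """Maximum hours of downtime per year permitted by a given availability SLA."""
    return HOURS_PER_YEAR * (1 - availability_pct / 100)

for tier in (99.0, 99.9, 99.99):
    hours = allowed_downtime_hours(tier)
    print(f"{tier}% availability allows {hours:.2f} hours "
          f"({hours * 60:.0f} minutes) of downtime per year")
```

Even a 'three nines' commitment leaves room for almost nine hours of unavailability a year, which is why the numbers written into an SLA matter as much as the existence of the SLA itself.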
According to Veeam's 2020 Data Protection Trends Report, 95% of global organizations suffer unexpected outages, lasting an average of almost two hours. For high-priority applications, which account for over half of a company's applications, one hour of downtime is estimated to cost $67,651. That means that for applications such as email, payments, websites and mobile apps, a single outage costs an average of over $135,000. While companies can fight the case for compensation, change providers if they are dissatisfied, or demand urgent maintenance of a system that causes downtime, there is no one-size-fits-all insurance model to protect businesses. A step towards tighter regulation of technology and telecoms could be a set of minimum service requirements, including a maximum amount of allowed downtime, the time to recover data and applications, and the frequency of software upgrades.
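The figure of over $135,000 follows directly from the report's own numbers: an average outage of almost two hours multiplied by the per-hour cost of high-priority downtime. Here is a quick back-of-the-envelope sketch of that arithmetic, together with a hypothetical example (illustrative, not proposed in the report) of what a set of minimum service requirements could look like:

```python
# Back-of-the-envelope outage cost, using the figures cited above from
# Veeam's 2020 Data Protection Trends Report.
COST_PER_HOUR_HIGH_PRIORITY = 67_651   # USD per hour of high-priority downtime
AVERAGE_OUTAGE_HOURS = 2               # "almost two hours", rounded for illustration

outage_cost = COST_PER_HOUR_HIGH_PRIORITY * AVERAGE_OUTAGE_HOURS
print(f"Estimated cost of one average outage: ${outage_cost:,}")  # $135,302

# Hypothetical minimum service requirements of the kind the article
# suggests regulators could mandate (all values are illustrative):
minimum_service_requirements = {
    "max_unplanned_downtime_hours_per_year": 8.76,   # equivalent to 99.9% availability
    "max_time_to_recover_data_hours": 4,
    "max_time_to_recover_applications_hours": 2,
    "min_software_upgrade_frequency_days": 90,       # upgrades at least every 90 days
}
```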
Securing tech’s reputation
Any discussion of downtime and the other glitches that could threaten technology's status as a utility leads to cybersecurity. The growing importance of IT in the world's day-to-day operations is an opportunity that cyber-attackers will pull out all the stops to exploit. Anything that is connected can be hacked. So, what does that mean in a world where everything is connected? In practice, it means cyber-attacks have risen again in 2020. Microsoft's 2020 Digital Defense Report shows that Office 365 alone has blocked 1.6 billion URL-based email phishing threats in the past twelve months. Of 6 trillion messages scanned for viruses, 13 billion malicious emails were blocked. This supports Veeam's own research, in which IT leaders named cyber threats their biggest challenge for the next 12 months, above issues such as a shortage of skills and the ability to meet customer demands.
The penalties for businesses that fail to secure their systems and data are already high. As well as the financial cost of downtime, the loss of customer confidence and the reputational damage leave a legacy that businesses cannot always recover from. All of this once again points towards the utility-like status of technology, in this case with specific reference to cybersecurity and data protection. Perhaps the question needs to change from which security provider a business uses to which security protocols businesses should be required to implement, based on the data they process. The General Data Protection Regulation (GDPR), which applies to all EU citizens' data, goes some way towards implementing a universal framework. But implementing cybersecurity measures is a choice rather than an enforced necessity. As cybersecurity becomes a utility that businesses need rather than a layer of IT they can choose, there is an opportunity to institute best practice across the board. Will cybersecurity training for office-based employees become mandatory, particularly with the rise of the remote workforce? Should all organizations publish a full disaster recovery plan detailing how they will recover data should it be lost or stolen? Going further, should personal data held by organizations be subject to a universal cybersecurity standard to ensure all citizens' data is protected to a satisfactory level?
Like the ongoing cybersecurity battle, the trend of technology permeating every aspect of our working and personal lives pre-dates 2020. However, this is undoubtedly a watershed moment for how technology is perceived, and an opportunity for the industry to demonstrate responsibility. We have already seen the 'techlash' aimed at companies that fail to protect data or use it ethically. At the same time, business leaders and people across the world have come to realize that having access to the Internet is the new 'keeping the lights on'. Our economies, societies and lives are enriched by the ability to communicate, share content and complete transactions online. The result is that technology's role in the world has evolved into something that is expected to be ubiquitous, always-on and permanently available. The world simply will not accept 'this page cannot be displayed' anymore.