In case you are still unfamiliar with the term: cloud computing is large-scale, shared IT infrastructure accessible over the internet. With the rise of this technology – a vital part of the Fourth Industrial Revolution – we’ve observed its numerous and undeniable advantages. In short, “the cloud” allows corporations, governments, and public organizations to collect, store, and analyze data at an unparalleled scale.
However, cloud computing has also significantly increased energy consumption on a global scale. According to the Microsoft Cloud Carbon Study of 2018, datacenters in the US alone consume as much energy each year as 6 million homes.
So, Houston, we obviously have a problem.
Cloud computing and carbon footprint
As briefly explained above, cloud computing offers some rather attractive opportunities for both service providers and their clients – including in the fields of environmental conservation and energy usage optimization.
Paradoxically enough, cloud computing technology itself has recently turned into a major factor in the ongoing climate change debate. Following the Paris Agreement and the world’s efforts towards sustainable behaviour, cloud computing’s environmental impact is increasingly scrutinized.
A possible solution suggested in the Journal of Parallel and Distributed Computing is distributing cloud workloads across a so-called “multi-cloud” – multiple data centres located in different time zones and powered by renewable energy sources.
As promising as it sounds, this alternative has not performed well so far, and the issue continues to unfold.
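The scheduling idea can be illustrated with a minimal “follow-the-sun” sketch – a hypothetical example, not the actual algorithm from the journal – that routes a workload to whichever region currently has daytime solar power:

```python
from datetime import datetime, timedelta

# Hypothetical multi-cloud regions and their UTC offsets; each is
# assumed to run on solar power during local daytime hours.
REGIONS = [
    {"name": "us-west", "utc_offset": -8},
    {"name": "eu-central", "utc_offset": 1},
    {"name": "ap-east", "utc_offset": 8},
]

def pick_region(utc_now: datetime) -> str:
    """Route work to a region where it is local daytime (peak solar)."""
    for region in REGIONS:
        local = utc_now + timedelta(hours=region["utc_offset"])
        if 9 <= local.hour <= 17:
            return region["name"]
    return REGIONS[0]["name"]  # fallback when no region is in daylight

print(pick_region(datetime(2019, 6, 1, 12)))  # midday UTC -> "eu-central"
```

In practice, a real scheduler would also weigh data-transfer costs and latency, which is part of why the approach has proved harder than it looks.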
How does Microsoft approach the issue?
As one of the chief tech giants on the planet, Microsoft steps up and tries to address both the environmental impact of cloud computing in general and that of its own “Microsoft Cloud” specifically.
Unsurprisingly, Microsoft’s official reports show significant energy efficiency improvements upon switching from traditional enterprise datacenters to the Microsoft Cloud. However, the estimated savings (22% to 93%) don’t actually prove that cloud computing is doing great. They merely suggest that things might have been a lot worse without it.
Later in the 2018 report, the Microsoft team identifies IT equipment efficiency, data centre infrastructure, and renewable electricity use as the key factors for reducing the cloud computing carbon footprint.
At the end of the day, Microsoft has defined a target of 70% renewable energy usage by 2023 – which is promising news.
How do other big companies address the cloud problem?
As giant as it is, the Microsoft corporation is still just a tiny part of the bigger picture. Some other leading tech companies that address the issue adequately and quite successfully include:
- Apple, whose data centres are entirely powered by renewable energy as of 2020.
- Google, which also operates continuously on 100% renewable energy.
This sounds great, indeed. But is it enough?
Meanwhile, on the other side of the coin…
Contrary to the “good guys” from Microsoft, Apple, and Google, some other tech giants are not paying too much attention to the cloud emissions issue. Not to point a finger, but here are the facts:
- Amazon Web Services (AWS) – one of the largest cloud service providers and proprietor of the “Data Centre Alley” – currently powers only 12% of its data centres with renewable sources, as reported by Greenpeace.
- Alibaba – the leading cloud provider in China – has done virtually nothing to reduce its impact on nature. A recent report indicates that China’s data centres released 99 million tons of carbon dioxide in 2018 alone. That equals the annual effect of an extra 21 million cars.
Now that doesn’t sound too encouraging, does it?
How does reality look in facts and figures?
Despite companies’ efforts to reduce the carbon footprints of their clouds, new devices come online every single day. The rapid growth of the global cloud is still driving further demand for dirty energy. How does that theory look in practice?
According to Stackscale, one average internet minute in 2019 amounted to:
- About 188 million emails sent;
- About 41.6 million Messenger and WhatsApp messages received;
- About 4 million YouTube videos watched;
- About 3.8 million Google search queries;
- About 4 million Facebook likes;
- About 1.4 million Tinder swipes;
- About 347 thousand Instagram stories;
- And more, and more, and more…
Once again – that was an average for 2019, before digital communication became an indispensable part of virtually everyone’s daily routine – children and elderly citizens included.
So, try to do the math if you can.
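A rough calculation helps: multiplying those per-minute figures by the 525,600 minutes in a year shows the scale involved.

```python
# Scale Stackscale's 2019 "internet minute" figures to a full year.
PER_MINUTE = {
    "emails sent": 188_000_000,
    "WhatsApp/Messenger messages": 41_600_000,
    "YouTube videos watched": 4_000_000,
    "Google searches": 3_800_000,
}

MINUTES_PER_YEAR = 60 * 24 * 365  # 525,600

for activity, count in PER_MINUTE.items():
    yearly = count * MINUTES_PER_YEAR
    print(f"{activity}: ~{yearly / 1e12:.1f} trillion per year")
# emails alone come to roughly 98.8 trillion per year
```

Every one of those events touches a data centre somewhere, drawing power each time.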
The solution? It might be called “Tiny AI.”
The idea behind Tiny AI is simple yet ingenious. The emerging technology suggests that our devices no longer need to communicate with the cloud for us to use the latest AI-driven features.
As of today, tech giants and academic researchers are developing novel algorithms that aim to shrink the current deep-learning models without decreasing their capabilities.
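To make the shrinking idea concrete, one widely used compression technique is quantization – storing model weights as 8-bit integers instead of 32-bit floats. The sketch below is a generic illustration under that assumption, not any particular vendor’s method:

```python
import numpy as np

def quantize(weights: np.ndarray):
    """Map float32 weights to int8 plus one scale factor (~4x smaller)."""
    scale = np.abs(weights).max() / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights for inference on the device."""
    return q.astype(np.float32) * scale

weights = np.random.randn(1000).astype(np.float32)
q, scale = quantize(weights)
restored = dequantize(q, scale)

print(q.nbytes, "vs", weights.nbytes)           # 1000 vs 4000 bytes
print(float(np.abs(weights - restored).max()))  # rounding error <= scale / 2
```

A quarter of the storage, and the per-weight error stays within half a quantization step – the kind of trade-off that lets models run on a phone instead of a server.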
The key players currently working on Tiny AI technology include Google, IBM, Apple, and Amazon. Some of them have already introduced Tiny AI-driven features in their products, including Google Assistant, Siri, and Apple’s QuickType keyboard.
Tiny AI has the potential to improve the speed and efficiency of many current cloud-based services, while providing an excellent foundation for developing brand-new ones. And the best part? The cloud does not participate in the process: emissions are considerably cut, and developers don’t have to choose between benefits for the user and benefits for the planet.
Because sometimes, an intelligent solution is all we need – as tiny as it might be.