How will Artificial Intelligence Drive the Growth of Future Data Centers?
Artificial Intelligence (AI) has been around for quite some time now, disrupting businesses and sectors with its ability to boost performance and deliver operational efficiencies. The data center industry is no exception. Today, data holds huge significance for any organisation, and managing that data effectively is equally important. Once filtered and crunched, harvested data proves vital to companies' strategic business decisions. Hence, companies are investing in advanced automation tools for data processing and migrating to hyperscale data centers to upgrade their IT infrastructure. The explosion of data in recent years has led hyperscalers to innovate and deploy AI technologies in their facilities to handle tasks autonomously.
The use of automation technologies in data centers is hardly new. Google, for example, has described using DeepMind AI to cool its data centers. However, companies are yet to leverage AI/ML to the fullest. Factors such as distrust in the technology have deterred many organisations from taking the leap towards AI. While the best-known use cases for AI in data centers are temperature control and predictive maintenance, AI's potential to enhance the efficiency of data center infrastructure is far greater than is widely appreciated. Let's look at some use cases of AI in data centers that will change the future of the industry.
Optimising workload distribution
As data center workloads grow along with data volumes, many businesses are looking to AI to boost efficiency and cut expenses. In a hybrid setting, AI can decide in real time where a workload should run, moving it to the most efficient infrastructure, whether cloud, on-premise or edge. As AI makes its way into the data center industry, organisations are adopting innovative approaches to handling their data that enable greater use of robust AI techniques and analytics.
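As a rough illustration of the placement idea above, the sketch below scores candidate infrastructure tiers for a workload and picks the cheapest one that meets its latency requirement. The tier names, latency figures and costs are invented assumptions for illustration, not any real scheduler's data.

```python
# Hypothetical placement sketch: pick the lowest-cost infrastructure tier
# (cloud, on-premise or edge) that satisfies a workload's latency limit.
# All numbers below are illustrative assumptions.

TIERS = {
    "cloud":   {"latency_ms": 80, "cost_per_hour": 0.50},
    "on_prem": {"latency_ms": 20, "cost_per_hour": 0.90},
    "edge":    {"latency_ms": 5,  "cost_per_hour": 1.40},
}

def place_workload(max_latency_ms: float) -> str:
    """Return the cheapest tier whose latency meets the requirement."""
    candidates = {
        name: spec for name, spec in TIERS.items()
        if spec["latency_ms"] <= max_latency_ms
    }
    if not candidates:
        raise ValueError("no tier satisfies the latency requirement")
    return min(candidates, key=lambda name: candidates[name]["cost_per_hour"])
```

A latency-tolerant batch job would land on the cheap cloud tier, while a 10 ms real-time workload would be forced to the edge; a production system would of course weigh many more signals than these two.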
Gartner predicts that by 2025, 70% of organisations will shift their focus from big data to small and wide data, providing more context for analytics and making AI less data-hungry. The small-data approach yields useful insights from fewer data points, whereas the wide-data approach draws on analysis of varied, large, unstructured data in diverse formats. Together, the two approaches enable advanced analytics and AI while reducing the reliance on big data.
Mitigating the people shortage
Automated technologies in data centers promise less human intervention in regular, repetitive tasks. Automation frees staff from mundane activities such as storage optimisation, cooling distribution and security settings, allowing them to focus on more critical issues. This not only achieves greater efficiency but also reduces the risk of human error when handling complex and diverse workloads. For example, at our Yotta NM1 data center, if a chiller pipe leaks, a sensor-enabled Leak Detection System both diagnoses the leakage and mitigates the problem in real time without manual intervention: upon detecting a leak in the chiller pipeline, the system automatically diverts the water flow through an alternate pipeline. All of this is managed without risking downtime in the data center. Automation is creating a pathway for data centers to move from reactive to preventive, and ultimately predictive, operations.
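The sense-and-respond loop described above can be sketched in a few lines. The sensor names, moisture threshold and valve actions below are illustrative assumptions, not Yotta's actual control system.

```python
# Illustrative sketch of an automated leak-response loop: for each chiller-pipe
# segment whose moisture sensor exceeds a threshold, close the leaking segment,
# divert flow to its alternate pipeline, and raise a maintenance ticket.
# Sensor names, threshold and action strings are assumptions for illustration.

def respond_to_leak(sensor_readings: dict, threshold: float = 0.8) -> list:
    """Return the actions an automated system might take for each leak."""
    actions = []
    for segment, moisture in sensor_readings.items():
        if moisture >= threshold:
            actions.append(f"close_valve:{segment}")
            actions.append(f"open_valve:{segment}-alternate")
            actions.append(f"ticket:leak detected on {segment}")
    return actions
```

The point of the sketch is the ordering: flow is diverted immediately and the human-facing ticket comes last, so the facility never waits on manual intervention to stay dry.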
Maximising power efficiencies
Power consumption is one of the most critical issues for data centers across the globe. Energy costs rise by at least 10% every year, and growing consumption by high-density servers is not environmentally sustainable either. Deploying AI/ML technologies can help curb energy use in data centers. Data center systems generate significant heat; traditionally, air conditioners, chillers and water pumps are controlled by a Building Management System (BMS) to keep temperatures in check, but this is not energy efficient. AI-based power management can optimise cooling by analysing historical data and building a Power Usage Effectiveness (PUE) prediction model, cutting power costs and improving efficiency.
Strengthening security
In a hyperscale data center, where several events occur simultaneously, it is nearly impossible for humans to monitor everything and alert everyone in threat situations. AI-powered tools have proven useful here. For example, image and sound recognition are widely used to enhance the physical security of data center facilities, and AI analytics is applied to video surveillance to make sense of the data collected by security cameras. Machine learning techniques are also leveraged for anomaly detection, where a system is trained on usual patterns and flags the irregular ones.
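The core of that anomaly-detection idea can be shown with a toy statistical sketch: learn what "usual" looks like from a history of normal readings, then flag values that deviate by more than a few standard deviations. The z-score threshold of 3 is an illustrative assumption; production systems use far richer models.

```python
import statistics

# Toy anomaly-detection sketch: flag readings that sit more than
# z_threshold standard deviations from the mean of known-normal history.
# The threshold value is an illustrative assumption.

def find_anomalies(history, readings, z_threshold=3.0):
    """Return the readings whose z-score against the history exceeds the threshold."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return [x for x in readings if abs(x - mean) / stdev > z_threshold]
```

Fed a history of normal server-inlet temperatures, such a detector would ignore ordinary fluctuation but flag a sudden spike, which is exactly the "usual versus irregular" distinction described above.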
At our Yotta NM1 data center, for example, AI-integrated security cameras keep an eye on critical locations. These cameras raise a ticket to the data center's security control room if they detect more than 10 people gathered at the front gate, or an object left unattended on the premises for a certain period of time. Without automation, it would take hundreds of security personnel to watch over the physical security of a facility that can span acres, 24/7. The same techniques support predictive analysis, where the AI system flags unusual occurrences in advance so they can be checked before a system breaks down completely. Data center security can thus be strengthened by using AI for self-learning threat detection and monitoring algorithms.
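The camera rules above reduce to simple thresholds once the vision model has produced counts and dwell times. The sketch below assumes hypothetical event fields and limits (the crowd limit of 10 echoes the example above; the dwell time is invented) to show how such rules might turn detections into tickets.

```python
# Illustrative rule-based alerting over camera detections: raise a ticket when
# a crowd exceeds a threshold or an object is left unattended too long.
# Event field names and default limits are assumptions for illustration.

def camera_alerts(events, max_people=10, max_dwell_minutes=15):
    """Return alert strings for detection events that breach the rules."""
    alerts = []
    for event in events:
        if event.get("people_count", 0) > max_people:
            alerts.append(f"crowd at {event['location']}")
        if event.get("object_dwell_minutes", 0) > max_dwell_minutes:
            alerts.append(f"unattended object at {event['location']}")
    return alerts
```

The hard part, of course, is the upstream recognition that fills in `people_count` and `object_dwell_minutes`; the rules layer itself stays deliberately simple so operators can audit why each ticket was raised.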
Data centers of the future will certainly be more AI-driven, with almost all functions in the facility automated. Though these technologies are currently in the hands of a few large hyperscalers and enterprises, they will soon trickle down to other data center players as the technology matures, trust grows and costs come down. Moreover, given the digital adoption accelerated by the pandemic, these advancements will be seen in practice sooner rather than later.