The world is heading towards a technological future where astronomical amounts of data will be collected, analysed and stored. We already produce massive amounts of data each day, and that volume will only increase as technologies like Artificial Intelligence and the Internet of Things become more central to the efficient operation of organisations.
Visualising the direction data usage will take in the coming years
Gartner predicts that by 2020 there will be over 20 billion ‘things’ connected to the internet. For this reason, easier and more efficient access to data for making informed decisions is becoming increasingly important.
Recently, I spoke with Data Center News Asia about one of the most talked-about topics in the data center world – edge computing. This technology fulfils the need for reduced latency and increased efficiency by moving data processing closer to where the data is created. Getting this right will be crucial for businesses to effectively deploy IoT, smart city and other services – and for ensuring people can use these services without issue.
According to a report by IBM, the world produces over 2.5 quintillion bytes of data each day. Considering the immense data usage and number of devices that are projected to be connected in the near future, it’s easy to understand why businesses big and small need to recognise the value of edge computing technology.
There are four archetypes that can help determine the infrastructure necessary to support edge computing, depending on the client’s needs. The areas analysed to create the archetypes include scalability, rapid deployment, efficiency and flexibility.
Data Intensive edge technology benefits businesses such as Netflix that consistently produce large amounts of data. It is impractical to continuously transfer that data through the cloud, particularly when HD content delivery is involved. Netflix and other streaming services need an edge infrastructure that reduces costs and latency and improves the overall customer experience.
Human-Latency Sensitive infrastructure serves data destined for human consumption and is well suited to retailers who need to analyse customer data quickly and accurately, in near-real time. Delays in data retrieval could reduce a retailer’s sales and profitability.
Machine-to-Machine Latency Sensitive infrastructure suits industries such as stock trading, where machines process data faster than humans can, so the consequences of latency are even higher than in human-latency scenarios.
Life Critical infrastructure supports human health and safety needs in places such as hospitals. It acknowledges how critical speed and reliability are in these circumstances.
As the production of data and the need for its fast retrieval continue to grow across more areas of business operations, the benefits of edge computing become clear. At this relatively early stage of the technology’s development, it is important to build infrastructures suited to the varying data needs of businesses, to ensure their customer experience is optimised.
Bottom line – the edge is essential to smart cities and any fast-developing technologies that will amplify the magnitude of data in the world. We need to get it right to keep our digital ambitions on track.
This article follows an interview in Data Center News Asia