What is Edge Computing?
When saying ‘the edge’ one could think of a cliff.
Others may think of the cutting-edge.
Perhaps even, Lady Gaga pops up in your head.
Yet, when combining edge and computing what do you get?
“I’m on the edge of glory,” may seem equally abstract.
After having written about artificial intelligence and data centres for quite some time, I thought it strange that I did not properly understand what edge computing is. So, here it goes: an article exploring that very question — what is edge computing?
The short of it can be found through an easy search online.
“Edge computing is a distributed computing paradigm that brings computation and data storage closer to the location where it is needed, to improve response times and save bandwidth.”
In addition, this illustration brought more clarity, at least to understand what edge computing is on the surface.
Reading it from bottom to top, it is about delivery networks.
Content delivery had to change in the late 1990s to ensure that richer content (such as video or images) was deployed close to users.
These networks evolved in the 2000s to host applications and application components.
This approach has developed through virtualization technology.
“Edge virtualization is the latest data center equipment trend, and several vendors have developed virtualization tools to help organizations maximize infrastructure use. Edge computing extends compute processing to new venues by moving system processing and intelligence out from the data center and closer to endpoints.”
One definition is that it delivers ‘low latency nearer to requests’.
In this sense, it is happening at the edge of the network.
Edge computing operates on ‘instant’ or ‘real-time data’ generated by sensors or users.
Edge nodes used for game streaming are known as gamelets.
“…the edge node is mostly one or two hops away from the mobile client to meet the response time constraints for real-time games”
Increasingly, Internet of Things (IoT) devices need compute.
This pushes up network bandwidth requirements.
The aim of edge computing is to move the computation.
Thus, moving away from data centres and towards the edge.
This can be done by using ‘smart objects’ or gateways to perform tasks and provide services — nodes.
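To make the role of such a gateway node concrete, here is a minimal sketch of one that aggregates raw sensor readings locally and forwards only a compact summary upstream, saving bandwidth as the definition above describes. The class and method names (`Gateway`, `ingest`, `summarise`) and the temperature readings are illustrative assumptions, not from any particular edge framework.

```python
# A hypothetical edge gateway: it buffers raw sensor readings locally
# and reduces them to one small summary payload for the cloud,
# instead of forwarding every reading.

from statistics import mean

class Gateway:
    def __init__(self):
        self.buffer = []

    def ingest(self, reading):
        """Collect a raw reading from a nearby sensor."""
        self.buffer.append(reading)

    def summarise(self):
        """Reduce buffered readings to a compact payload for the cloud."""
        if not self.buffer:
            return None
        summary = {
            "count": len(self.buffer),
            "mean": mean(self.buffer),
            "max": max(self.buffer),
        }
        self.buffer.clear()
        return summary

gw = Gateway()
for temp in [21.0, 21.5, 35.0, 22.0]:
    gw.ingest(temp)
print(gw.summarise())  # one small message instead of four raw ones
```

Four readings go in, one summary comes out — that is, roughly, the bandwidth-saving idea in the definition quoted earlier.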
However, distributing logic across network nodes brings new challenges.
One is pertinent: security.
If you can break a device, you may break in, or intercept data in transit.
Edge nodes can be constrained devices without the resources of a data centre.
As such, companies attempt to shift the risk or ownership of the data collected from service providers to end-users.
To scale or distribute, one first has to realise that devices are different (heterogeneous).
There are different performance and energy considerations, and advanced security could slow down transfers (so a trade-off is introduced).
How reliable or how fast? How connected and how divided?
Operability or interoperability?
Contingency in case of failures may be important, and one needs to be aware of the distribution to detect any issues/failures/breaches.
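Being aware of the distribution in order to detect failures can be as simple as tracking heartbeats. The sketch below flags any node that has gone silent for too long; the node names, timestamps, and timeout are illustrative assumptions.

```python
# A hypothetical failure detector for distributed edge nodes:
# each node reports a heartbeat timestamp, and any node silent
# for longer than `timeout` seconds is flagged as failed.

def find_failed(last_heartbeat, now, timeout):
    """Return the nodes whose last heartbeat is older than `timeout` seconds."""
    return [node for node, ts in last_heartbeat.items() if now - ts > timeout]

heartbeats = {"gateway-a": 100.0, "gateway-b": 55.0, "sensor-hub": 98.0}
print(find_failed(heartbeats, now=105.0, timeout=30.0))  # ['gateway-b']
```

Real systems layer retries, clock-skew handling, and alerting on top of this, but the core idea is the same: you can only detect a failure in a distributed system if you are tracking the distribution in the first place.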
If an edge system is well-designed, it can speed up response times.
Still, as mentioned there is that possible trade-off in terms of risk or increased responsibility in distributed systems.
AI can as such run at the ‘edge’, using resource-rich machines called cloudlets near mobile users.
A cloudlet is a mobility-enhanced small-scale cloud datacenter.
As mentioned, ‘offloading’ is important and can be applied depending on the workload.
Device and node to cloud.
Car and node to cloud(s).
Sensor and node to cloud.
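The device-to-node-to-cloud pattern above can be sketched as a simple offloading decision: small workloads stay on the device, medium ones go to a nearby edge node, heavy ones go to the cloud. The thresholds (in rough operations per task) are illustrative assumptions, not real capacity figures.

```python
# A hypothetical offloading decision across the three tiers mentioned
# above. The limits are made-up placeholders for device and edge capacity.

def choose_tier(workload_ops, device_limit=1e6, edge_limit=1e9):
    """Pick where to run a workload, based on its size in operations."""
    if workload_ops <= device_limit:
        return "device"
    if workload_ops <= edge_limit:
        return "edge node"
    return "cloud"

print(choose_tier(5e5))   # device
print(choose_tier(5e7))   # edge node
print(choose_tier(5e10))  # cloud
```

In practice the decision also weighs latency, battery, and connectivity — the very trade-offs discussed earlier — but the tiered shape is the same.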
On the one hand, it is not this simple, of course.
On the other hand, I hope this brought you closer to understanding the concept of edge computing.
This is #500daysofAI and you are reading article 357. I am writing one new article about or related to artificial intelligence every day for 500 days. My focus for day 300–400 is about AI, hardware and the climate crisis.