ISSIP

Edge Computing: What It Is and What Its Future Holds For Us

By Dr. Christine Ouyang

Edge computing is a distributed computing paradigm in which data processing and decision-making occur at or near the data source. Let me explain in simple terms. If a centralized data center or cloud is our brain, the edge is our limbs, ears, eyes, nose, tongue, and skin. When we withdraw our hand from a hot iron, sophisticated neural processing occurs at the lowest level of our motor hierarchy: the automatic reflex enables us to act quickly to protect ourselves from burning without having to think about it consciously in our brain. Similarly, in edge computing, computing resources such as servers and storage are placed close to the devices and sensors that generate data, allowing for real-time or near-real-time information processing, decision-making, and action-taking. Key advantages of edge computing include fast responses, reduced latency, and improved data security and privacy. These advantages stem from edge computing's ability to manage, store, and process large amounts of often sensitive data at the edge rather than transmitting it to centralized data centers or the cloud.

Just as a reflex can be modulated by higher levels of our motor hierarchy, edge computing can be regulated or modulated by cloud computing. For example, an artificial intelligence (AI) model can be trained and maintained in the cloud and then deployed to thousands or even hundreds of thousands of endpoints at the edge (for example, Internet of Things (IoT) devices) for inferencing. This contrasts with federated AI, where the model is trained at the edge without aggregating the data in a central cloud environment. In the continuum from edge to cloud, there can also be regional data centers, network edges, or edge clouds acting as our peripheral nervous system and spinal cord, relaying signals (such as policies, data, and models) and making decisions.
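The cloud-train, edge-infer pattern described above can be sketched in a few lines. This is a minimal, illustrative example, not any particular vendor's API: the "cloud" fits a trivial linear model on a toy dataset, the trained parameters are serialized into an artifact (standing in for a model push to edge devices), and the "edge" loads that artifact and runs predictions locally without the raw sensor data ever leaving the device.

```python
import json

# --- Cloud side (illustrative): train a simple model centrally ---
# We "train" a trivial linear model y = w*x + b by least squares on a
# tiny dataset; a real deployment would use a full ML framework, but
# the deploy/infer split works the same way.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 4.0, 6.2, 7.9]  # roughly y = 2x

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
w = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
    / sum((x - mean_x) ** 2 for x in xs)
b = mean_y - w * mean_x

# "Deploy": serialize the trained parameters and ship the artifact
# to the edge (over the network in practice; a string here).
model_artifact = json.dumps({"w": w, "b": b})

# --- Edge side: load the model and infer locally ---
# Only the model crossed the network; local data stays on the device.
model = json.loads(model_artifact)

def infer(x):
    """Low-latency prediction made on the edge device itself."""
    return model["w"] * x + model["b"]

print(infer(5.0))
```

In a federated setup, the arrow would point the other way: each edge device would compute parameter updates on its local data and send only those updates back to the cloud for aggregation.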

Edge computing may sound like a complex concept, but its ability to offer low latency and enhanced data security means it could be widely used with many emerging technologies, such as IoT, AI, robotics, digital twins, and 5G, to name a few. A growing number of use cases also show that many businesses have started to leverage edge computing to improve safety, efficiency, and customer experiences. Examples include data-intensive and time-critical applications in autonomous vehicles, industrial automation, healthcare, personalized retail experiences, smart homes/buildings/cities/nations, virtual reality, and augmented reality. However, most edge solutions today are highly customized, First-Of-A-Kind (FOAK) builds for individual use cases, making them difficult to scale.

Market adoption of edge computing is growing at a phenomenal rate, but to amplify its impact on business and society, we may need innovative approaches that effectively support this growing adoption across many industries. For example, we may consider building the ability to manage the fragmentation of systems and standards caused by edge computing; the ability to create edge platforms or solutions that can be consumed as services so that they scale across different industries; and the ability to extend edge computing as a holistic, horizontal technology that uses resources efficiently and reduces operational cost. Moreover, advancing standardization in edge computing would improve its interoperability, which in turn could enable more seamless integration with other devices and systems.

Finally, when I think of skills for the future, edge computing calls for a broad set of skills and expertise, including networking, IoT, programming, cloud computing, security, data management, AI, and even blockchain, which makes my brain hurt!