By Benjamin Brillat, Distinguished Engineer at Kyndryl
Edge computing, a distributed computing framework that moves compute, storage and data processing closer to the sources of data generation, such as users and devices, is helping more businesses improve how they manage and use physical assets.
In fact, the global edge computing market is expected to grow to $16.5 billion by 2030, up from $11.2 billion in 2022, according to a new Grand View Research report.
However, the growth of edge computing comes with many questions and common misperceptions: Is edge going to replace the cloud? Is edge the same as IoT? Is security going to be a problem as the data moves farther away from a secure core in the cloud?
Here are five popular myths — and respective truths — to consider about edge computing.
Myth 1. Edge computing means the end of the cloud
Far from ending the cloud, edge computing is the furthest extent — the edge — of the cloud. By pre-processing data at the far edges of a network, edge computing reduces the amount of raw data sent to the cloud, which both optimizes bandwidth usage and reduces cloud processing costs. This proximity to data sources also reduces latency and enables real-time data processing, making it valuable for applications where low latency and quick decision-making are critical, such as retrieving a patient's medical images and test results in real time, improving manufacturing quality control and maintenance, enhancing worker safety, and preventing retail loss, to name a few.
Edge computing applies an optimized set of resources — compute, storage, network and location — to the task at hand. It complements cloud technologies by bringing computing capabilities to the edge of the network, closer to where data is generated. This helps keep data compliant with regulatory requirements, since only selected data is sent on to the cloud for further analysis. Edge computing can also enhance security when paired with modern security frameworks such as zero-trust network access.
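The pre-processing idea above can be illustrated with a minimal, hypothetical sketch: instead of streaming every raw sensor reading to the cloud, an edge node aggregates readings locally and uploads only a compact summary. The function name and summary fields here are illustrative, not from any specific product.

```python
# Hypothetical edge pre-processing sketch: aggregate raw readings
# locally so only a small summary needs to travel to the cloud.

def summarize_readings(readings, threshold):
    """Reduce a stream of raw readings to a few summary fields.

    Only this summary (a handful of numbers) would be sent upstream,
    instead of the full stream of raw data.
    """
    anomalies = [r for r in readings if r > threshold]
    return {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "max": max(readings),
        "anomaly_count": len(anomalies),
    }

# Example: eight temperature readings reduced to a four-field summary.
raw = [20.5, 20.6, 20.4, 21.5, 20.5, 22.0, 20.3, 20.5]
summary = summarize_readings(raw, threshold=21.0)
```

However many readings arrive, the payload sent upstream stays the same size, which is where the bandwidth and cost savings come from.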
Myth 2. Edge computing disrupts the convergence of IT and OT
The rise of edge computing plays a key role in bringing the information technology (IT) and operational technology (OT) worlds together, driven by the industrial sector's ongoing digital transformation. IT, the software and hardware systems that manage information and data, is now interoperable with OT, the hardware and software that monitors and controls physical equipment and processes. Edge computing acts as a bridge between the IT and OT worlds, facilitating seamless integration and communication between these traditionally distinct domains. With edge computing, data from OT devices can be efficiently collected, pre-processed and forwarded to cloud-based IT systems for further analysis, storage and long-term insights. Additionally, edge computing supports smart technologies like AI algorithms and machine learning for real-time data analysis, enabling more intelligent and predictive capabilities at the edge.
Myth 3. Edge computing is only relevant for industrial use cases such as manufacturing, energy, mining and transportation
Edge computing has been around for decades, but its adoption has accelerated significantly in recent years thanks to technological advancements such as 5G, more capable hardware, the rise of IoT and mobile devices, and demand for real-time processing. And edge computing benefits many industries beyond the industrial sector. Take retail, for example. Edge computing enables brands to implement asset tracking and supply chain optimization. With many industries facing staffing shortages today, stores can set up cashierless payment, such as mobile self-checkout (Apple Pay, Google Pay, etc.), giving customers a better experience by letting them make purchases without long checkout lines. With access to near real-time data, retailers can push out more personalized promotions to their customers, which not only increases sales but drives up revenue per square foot.
Other industries such as government, healthcare and financial services are driving innovation with edge computing, too. For example, edge computing can help address data compliance mandates for insurance companies, banks and other financial institutions by bringing customer-facing applications and databases back on-premises, while in hospitals, edge computing enables better patient data management.
Myth 4. Edge computing and IoT mean the same thing
While both technologies work to capture data, edge and IoT are not the same. In edge computing, data processing is done locally; IoT devices typically send their data to the cloud for analysis. IoT devices must be connected to the internet to function properly, whereas for edge devices connectivity is optional.
IoT devices have few data processing needs, so they are best suited for simple tasks, like the smart speakers and thermostats we use at home. In contrast, edge devices run complex operating systems and can support a range of data processing capabilities, from autonomous vehicles to robot-assisted surgery. Each IoT device typically performs a single, specific function (one-way communication), while a single edge device can handle more than one function (bi-directional communication).
Myth 5. Edge computing requires fast cellular network connectivity
One of the benefits of edge computing is that you don't always need constant connectivity. By processing data locally, the amount of data that must be sent can be vastly reduced, requiring far less bandwidth or connectivity while also reducing costs. This makes edge computing well-suited for use cases in remote areas or scenarios with limited connectivity. Other common use cases where always-on connectivity is not needed include autonomous vehicles, healthcare devices and security surveillance/monitoring cameras.
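One common pattern behind this resilience is store-and-forward: the edge node keeps working while offline, buffering results locally, and drains the buffer when a connection becomes available. The sketch below is a hypothetical illustration of that pattern, not a description of any particular product.

```python
# Hypothetical store-and-forward sketch: an edge node buffers data
# while offline and uploads the backlog once connectivity returns.

class EdgeBuffer:
    def __init__(self):
        self.pending = []   # summaries awaiting upload
        self.online = False

    def record(self, summary):
        """Always accept data locally, regardless of connectivity."""
        self.pending.append(summary)

    def flush(self, upload):
        """Upload buffered items if a connection is available.

        Returns the number of items sent; offline, nothing is lost —
        the backlog simply waits for the next flush.
        """
        if not self.online:
            return 0
        sent = len(self.pending)
        for item in self.pending:
            upload(item)
        self.pending.clear()
        return sent

buf = EdgeBuffer()
buf.record({"mean": 20.4})
buf.record({"mean": 20.6})
sent_offline = buf.flush(print)   # still offline: 0 sent, data kept
buf.online = True
sent_online = buf.flush(print)    # connection restored: backlog drained
```

The device keeps collecting and acting on data the whole time; connectivity only determines when the cloud catches up.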
Learn more about how Kyndryl and VMware jointly develop innovations to better reach and serve customers.