VISUALIZE | INSIGHTS THAT POWER INNOVATION

Edge computing is coming: What is it?

Technology has played a major role in defining the developments of the past decade. Ten years from now, our retrospective on the 2020s may very well say the same. The ISO Emerging Issues team covers technological breakthroughs to glean a sense of how the risks of the future may develop. Among other technologies, we have covered and continue to monitor quantum computing (see our Quantum page here), artificial intelligence, and 5G (see our 5G page here).


Edge computing may be around the corner, delivering with it a host of associated benefits and risks.

Could edge computing have a significant impact on emerging technologies, as well as many established ones?

What is edge computing?

According to Techopedia, edge computing "is defined as the deployment of data-handling activities or other network operations away from centralized and always-connected network segments, and toward individual sources of data capture, such as endpoints like laptops, tablets or smartphones."1 Network World notes that this capability could allow organizations to analyze important data nearly instantaneously.2

Here are some of the benefits that could coincide with the deployment of edge computing.

Increased bandwidth

To be clear, cloud computing will still be part of the picture. According to a white paper drafted by the U.S. Federal Communications Commission (FCC), edge computing will augment, rather than supplant, the cloud.3

The edge could serve as a first line of defense against the growing tsunami of data currently streamed to the cloud. In some cases, the edge could process the data it can handle locally and comb through the rest, preventing unnecessary data from being delivered to the cloud and clogging the internet pipes in the process.

Here's one example from Codeburst.io: an oil rig in the ocean has thousands of sensors producing large amounts of data, most of which could be inconsequential; perhaps it's data that confirms systems are working properly. That data doesn't necessarily need to be sent over a network as soon as it's produced, so instead, the local edge computing system compiles the data and sends daily reports to a central data center or cloud for long-term storage. By sending only important data over the network, the edge computing system reduces the data traversing the network.4

By being more efficient in determining what gets directed to the cloud, greater bandwidth could be available to optimize the capabilities of the increasing set of connected and powerful devices.
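The filtering pattern in the oil-rig example can be sketched in a few lines. The sensor names, operating band, and report shape below are hypothetical placeholders, not details from any real deployment; the sketch only illustrates an edge node discarding routine readings and forwarding a compact summary plus any anomalies:

```python
from statistics import mean

# Assumed "normal" operating range for this (hypothetical) sensor type.
NORMAL_BAND = (50.0, 80.0)

def summarize_day(readings):
    """Keep out-of-band readings verbatim; compress routine data to a summary."""
    anomalies = [(sid, v) for sid, v in readings
                 if not (NORMAL_BAND[0] <= v <= NORMAL_BAND[1])]
    routine = [v for _, v in readings
               if NORMAL_BAND[0] <= v <= NORMAL_BAND[1]]
    return {
        "routine_count": len(routine),
        "routine_mean": round(mean(routine), 2) if routine else None,
        "anomalies": anomalies,  # only these need immediate attention
    }

readings = [("pressure-1", 61.2), ("pressure-1", 63.8),
            ("pressure-1", 97.5),  # out-of-band reading worth forwarding
            ("pressure-1", 60.9)]
report = summarize_day(readings)
# Instead of four raw readings, the edge node sends one compact report
# (three routine values summarized, one anomaly preserved) to the cloud.
```

In a real system the summary would be batched into the daily report the article describes; the point is that only the summary, not every reading, crosses the network.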

Reduced latency

As TechCrunch notes, in order for technologies requiring immediate response times (self-driving cars, for example, as noted in our post here) to thrive, any amount of latency is too much latency.5 This is a concept we touch on in our overview of 5G, a technology that could also usher in an era of minimal latency.

In describing how a voice assistant handles a request, The Verge effectively illustrates why routing tasks through the cloud produces latency.6

"Voice assistants typically need to resolve your requests in the cloud, and the roundtrip time can be very noticeable," notes The Verge. "Your Echo has to process your speech, send a compressed representation of it to the cloud, the cloud has to uncompress that representation and process it — which might involve pinging another API somewhere, maybe to figure out the weather, and adding more speed of light-bound delay — and then the cloud sends your Echo the answer, and finally you can learn that today you should expect a high of 85 and a low of 42, so definitely give up on dressing appropriately for the weather."

With edge computing capabilities, the voice assistant's responses would no longer be dependent on a distant server but rather could address your needs locally on the device, leading to a faster response. In the world of tech, quicker is better.
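The roundtrip The Verge describes can be framed as a simple latency budget. Every number below is a hypothetical placeholder, not a measurement; the point is only that the cloud path sums several network and processing hops, while the on-device path collapses to a single local step:

```python
# Hypothetical latency components, in milliseconds (illustrative only).
cloud_path = {
    "compress_on_device": 20,
    "uplink_to_cloud": 60,
    "cloud_processing": 40,
    "third_party_api": 80,   # e.g., fetching a weather forecast
    "downlink_to_device": 60,
}
edge_path = {
    "local_processing": 50,  # the model runs on the device itself
}

cloud_total = sum(cloud_path.values())  # 260 ms across five hops
edge_total = sum(edge_path.values())    # 50 ms, no network roundtrip
print(f"cloud roundtrip: {cloud_total} ms, on-device: {edge_total} ms")
```

Even if local processing were slower per step than a data center, removing the network hops entirely is what makes the edge path faster end to end.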

Increased capabilities

As Network World notes, with reduced latency, edge computing paired with artificial intelligence could enable more robust real-time processing in impactful services. For example, in addition to self-driving cars, ZDNet points to real-time securities markets and transportation traffic routing as other offerings that could be optimized with edge computing.7

Cyber angle: Pros and cons

You didn't think we could discuss a potentially groundbreaking technological breakthrough without examining its cyber implications, did you?

Being able to perform computations on a device could offer both benefits and risks. The Verge notes that keeping information such as biometrics on the device could allow for greater privacy. Additionally, we have been posting about some vulnerabilities in cloud computing, including a significant breach that resulted in 106 million stolen Capital One accounts. If less data is in the cloud, it stands to reason that less data would be vulnerable in the event of a breach.

However, some cyber risks could coincide with the deployment of edge computing. A research director at Gartner told ZDNet that: "Security at the edge remains a huge challenge, primarily because there are highly diverse use cases for IoT, and most IoT devices don't have traditional IT hardware protocols. So the security configuration and software updates which are often needed through the lifecycle of the device may not be present."8

The article notes that this issue is tricky to navigate. A director of technology at a cyber defense company explained that devices that incorporate edge computing could be disincentivized from improving security because: "As soon as you start trying to design a system to be resistant to an attacker attempting to gain physical access, you usually end up designing something that isn't edge computing anymore—because if the data doesn't reside on the device, but a cloud or a data centre, the data lives somewhere else."

Nonetheless, if a balance is struck, there do appear to be avenues available to shield these devices from cyberattacks. The FCC white paper notes that "[w]ith software-defined networking,… [which requires] local processing to find the best route to send data at each point of the journey… it's possible to develop a multi-layered approach to security that takes the communication layer, hardware layer and cloud security into consideration simultaneously."

Moving Forward

Edge computing use is reportedly projected to take off in the coming years. One set of analysts, according to ZDNet, forecasts that the market will surge from $1.47 billion in 2017 to $6.72 billion by 2022, a compound annual growth rate (CAGR) of 35.4 percent. This growth, according to the article, may result from 5G deployment as well as the continued application of the IoT.
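The growth rate in that forecast can be checked directly from its two endpoints: growing from $1.47 billion in 2017 to $6.72 billion in 2022 spans five compounding years, so the annual rate is the fifth root of the ratio, minus one:

```python
# Endpoints from the ZDNet-cited forecast: $1.47B (2017) to $6.72B (2022).
start, end, years = 1.47, 6.72, 5
cagr = (end / start) ** (1 / years) - 1
print(f"CAGR: {cagr:.1%}")  # prints 35.5%, within rounding of the cited 35.4 percent
```

The small gap with the quoted figure likely reflects rounding in the analysts' own endpoint values.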

Additionally, the FCC notes that right now, "about 10 percent of enterprise-generated data is created and processed outside a traditional centralized data center or cloud." With the increased use of edge computing, this number may reach 75 percent by 2022.

These forecasts appear to indicate that edge computing may be around the corner, delivering with it a host of associated benefits and risks.

To learn more about potential emerging risks facing insurers, sign up for the ISO Emerging Issues Weekly Digest.


David Geller

David Geller, CPCU, SCLA, is product strategy manager, Underwriting Solutions at Verisk. He can be reached at David.Geller@Verisk.com.



  1. "What Does Edge Computing Mean?" Techopedia, < https://www.techopedia.com/definition/32472/edge-computing >, accessed on June 25, 2021.
  2. Keith Shaw, "What is edge computing and why does it matter?" Network World, November 13, 2019, < https://www.networkworld.com/article/3224893/what-is-edge-computing-and-how-it-s-changing-the-network.html >, accessed on June 25, 2021.
  3. "5G Edge Computing Whitepaper," U.S. Federal Communications Commission, < https://transition.fcc.gov/bureaus/oet/tac/tacdocs/reports/2018/5G-Edge-Computing-Whitepaper-v6-Final.pdf >, accessed on June 25, 2021.
  4. "What Is Edge Computing? The Quick Overview Explained With Examples," Codeburst, June 24, 2019, < https://codeburst.io/what-is-edge-computing-the-quick-overview-explained-with-examples-bc8e1ec5b9a0 >, accessed on June 25, 2021.
  5. Ron Miller, "Intel's latest chip is designed for computing at the edge," TechCrunch, February 7, 2018, < https://techcrunch.com/2018/02/07/intels-latest-chip-is-designed-for-computing-at-the-edge/ >, accessed on June 25, 2021.
  6. Paul Miller, "What Is Edge Computing?" The Verge, May 7, 2018, < https://www.theverge.com/circuitbreaker/2018/5/7/17327584/edge-computing-cloud-google-microsoft-apple-amazon >, accessed on June 25, 2021.
  7. Scott Fulton III, "What is edge computing? Here's why the edge matters and where it's headed," ZDNet, June 1, 2020, < https://www.zdnet.com/article/where-the-edge-is-in-edge-computing-why-it-matters-and-how-we-use-it >, accessed on June 25, 2021.
  8. Danny Palmer, "Edge computing: The cybersecurity risks you must consider," ZDNet, October 1, 2018, < https://www.zdnet.com/article/edge-computing-the-cyber-security-risks-you-must-consider/ >, accessed on June 25, 2021.
