We know that IoT is growing fast in the industrial sector. Today, billions of devices are connected to the internet and generating huge amounts of data. To manage and analyze this data, we use many tools, techniques, and time-series databases. But we can't analyze, store, and manage everything in real time. This is where edge computing plays its most relevant role.
With the help of edge computing, we process data at the end-node source. That source can be an IoT device, an IoT gateway, an edge server, a laptop/computer, or a smartphone. Essentially, we move intelligence to the edge and let edge nodes/devices do real-time analytics. In this way we reduce application latency and save network bandwidth. The architecture is distributed, as opposed to centralizing all processing in the cloud, and it minimizes long-distance client-server communication.
Gartner defines edge computing as,
part of a distributed computing topology where information processing is located close to the edge, where things and people produce or consume that information.
HPE defines edge computing as,
Edge computing is a distributed, open IT architecture that features decentralised processing power, enabling mobile computing and Internet of Things (IoT) technologies. In edge computing, data is processed by the device itself or by a local computer or server, rather than being transmitted to a data centre.
What is an example of edge computing? 
Consider a building secured with dozens of high-definition IoT video cameras. These are “dumb” cameras that simply output a raw video signal and continuously stream that signal to a cloud server. On the cloud server, the video output from all the cameras is put through a motion-detection application to ensure that only clips featuring activity are saved to the server’s database. This means there is a constant and significant strain on the building’s Internet infrastructure, as significant bandwidth gets consumed by the high volume of video footage being transferred. Additionally, there is a very heavy load on the cloud server, which has to process the video footage from all the cameras simultaneously.
Now imagine that the motion-sensing computation is moved to the network edge. What if each camera used its own internal computer to run the motion-detecting application and then sent footage to the cloud server only as needed? This would result in a significant reduction in bandwidth use, because much of the camera footage would never have to travel to the cloud server.
Additionally, the cloud server would now only be responsible for storing the important footage, meaning that the server could communicate with a higher number of cameras without getting overloaded. This is what edge computing looks like.
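The camera scenario above can be sketched in a few lines of Python. This is a minimal illustration under simplifying assumptions, not a production motion detector: frames are modeled as flat lists of grayscale pixel values, and the comment about uploading stands in for whatever cloud API the device would actually call (a real camera would decode frames with a library such as OpenCV).

```python
# Minimal edge-side motion filter: only frames that differ enough from the
# previous frame would be sent to the cloud. Frames are flat lists of 0-255
# grayscale pixel values; a real device would decode camera frames instead.

def motion_score(prev_frame, frame):
    """Mean absolute pixel difference between two consecutive frames."""
    return sum(abs(a - b) for a, b in zip(prev_frame, frame)) / len(frame)

def filter_motion(frames, threshold=10.0):
    """Yield only the frames whose motion score exceeds the threshold."""
    prev = frames[0]
    for frame in frames[1:]:
        if motion_score(prev, frame) > threshold:
            yield frame  # in practice: upload this clip to the cloud server
        prev = frame

# Example: a static scene, then a sudden change in half the pixels.
static = [50] * 100
moving = [50] * 50 + [200] * 50
uploads = list(filter_motion([static, static, moving, moving]))
# Only the frame where the scene changed is selected for upload.
```

Everything the filter discards stays on the device, which is exactly where the bandwidth and server-load savings in the example come from.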
Benefits of Edge Computing
- Reduced Latency: Because edge computing can separate critical data from not-so-critical data locally, it reduces latency (i.e., the time it takes to send data and receive a reply) by avoiding the round trip to the cloud.
- Better Security: In edge computing, data is decentralized and distributed among the different IoT edge devices where it is generated, so it is difficult to take down the whole network or compromise all of the data with a single attack.
- Cost Savings: Lower data traffic and reduced cloud storage lead, in turn, to lower costs and more efficient business operations.
- Greater Reliability: The cloud is a good option, but we can't always stay connected to it. Edge devices/nodes can deal with temporary outages by storing or processing data locally.
- Scalability: While cloud infrastructure is built for scalability, data still needs to be sent to datacentres and stored centrally. Unlike cloud, edge computing allows you to scale your IoT network as needed, without reference to the available storage (or its costs).
- Interoperability: Edge node devices can act as intermediaries that interface legacy machines with modern ones.
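The "sort critical from not-so-critical" idea behind the latency and cost benefits above can be sketched concretely. This is an illustrative assumption, not a standard API: the normal range, the reading values, and the idea of a single numeric threshold are all made up for the example; a real edge node would apply whatever triage rule its application needs.

```python
# Sketch of edge-side triage: an edge node keeps routine sensor readings local
# and forwards only anomalous ones to the cloud. The normal range and the
# sample readings are illustrative assumptions, not a real device's values.

NORMAL_RANGE = (15.0, 35.0)  # e.g. an acceptable temperature band in Celsius

def triage(readings, normal_range=NORMAL_RANGE):
    """Split readings into (critical, local) based on a normal range."""
    low, high = normal_range
    critical = [r for r in readings if not (low <= r <= high)]
    local = [r for r in readings if low <= r <= high]
    return critical, local

readings = [21.5, 22.0, 80.3, 23.1, -5.0, 24.8]
critical, local = triage(readings)
# Only `critical` would cross the network; `local` is stored or discarded
# on-device, which is where the bandwidth and latency savings come from.
```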
Which are the technologies that enable computing at the edge of networks?
Cloud computing is essential for enabling edge computing, but edge computing does not replace the cloud; they complement each other. Based on the requirements, engineers and developers must decide what data is best processed at the edge and what should go to the cloud.
Fog computing describes a decentralized computing structure located between the cloud and the devices that produce data. This flexible structure enables users to place resources, including applications and the data they produce, in logical locations to enhance performance. We place compute and storage in the most logical and efficient location between the cloud and the origin of the data. As part of the fog computing infrastructure, there are cloudlets and micro data centres, which are simply edge servers clustered together to serve local storage or compute requirements.
Multi-Access Edge Computing (MEC): In cellular networks such as 3G/4G/5G, the Radio Access Network (RAN) handles radio/wireless resources and communication. MEC places compute and storage resources within the RAN to improve network efficiency and content delivery.
Difference between Edge Computing and Cloud Computing
How does edge computing differ from fog computing?
Edge devices and edge nodes are different things. Edge devices can be sensors, actuators, IoT devices, and even IoT gateways. Edge nodes belong to the fog network that interfaces edge devices to the cloud. Therefore, fog computing depends on edge computing, but edge computing does not depend on fog computing.
Edge computing is any processing that happens close to the origin of the data (where edge devices produce it). Any processing that happens on devices connected to the LAN, or on LAN hardware itself, is considered fog computing. Investing in fog computing makes sense if data has to be aggregated from many edge devices.
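The aggregation role that makes fog computing worthwhile can be sketched as follows. This is a minimal sketch under stated assumptions: the device names, the raw readings, and the min/avg/max summary shape are all illustrative, standing in for whatever summary a real fog node would forward upstream.

```python
# Sketch of fog-level aggregation: a fog node on the LAN collects raw readings
# from many edge devices and forwards only a compact per-device summary to the
# cloud. Device names and the summary fields are illustrative assumptions.

def summarize(raw_by_device):
    """Reduce each device's raw readings to a min/avg/max summary."""
    summary = {}
    for device, values in raw_by_device.items():
        summary[device] = {
            "min": min(values),
            "max": max(values),
            "avg": sum(values) / len(values),
        }
    return summary

raw = {
    "sensor-a": [20.0, 21.0, 22.0],
    "sensor-b": [30.0, 34.0],
}
cloud_payload = summarize(raw)
# One small dict crosses the WAN instead of every raw reading from every device.
```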
What hardware is available to implement edge computing?
Important considerations for edge computing hardware include processing power, power source, memory, wireless connectivity, variety of ports/interfaces, reliability and ruggedness.
RISC processors such as ARM, ARC, Tensilica, and MIPS are preferred over CISC. While the ARM Cortex family is suitable, ARM also offers Neoverse specifically for edge use cases. The ARM Cortex-M55 CPU and the Ethos-U55 NPU target AI at the edge.
NVIDIA Jetson modules are designed for the edge. For example, Jetson Nano has a 128-core GPU, while Jetson TX2 and Jetson Xavier target industrial and robotic use cases. There is also the NVIDIA EGX platform, which offers GPU-powered edge servers.
Intel has Movidius and the Myriad 2. The latter is also part of Intel's Neural Compute Stick (NCS), which draws power from the host device via USB.
How are cloud providers enabling edge computing?
AWS Snowball – Snowball is a petabyte-scale data transport solution that uses secure appliances to transfer large amounts of data into and out of the AWS cloud. Using Snowball addresses common challenges with large-scale data transfers including high network costs, long transfer times, and security concerns.
Azure IoT Edge – Azure IoT Edge is a fully managed service built on Azure IoT Hub. Deploy your cloud workloads—artificial intelligence, Azure and third-party services or your own business logic—to run on Internet of Things (IoT) edge devices via standard containers.
Many other options are available; if you are interested, see: Best IoT Edge Platforms (Recommended)
What are the drawbacks of edge computing? 
One drawback of edge computing is that it can increase attack vectors. With the addition of more “smart” devices into the mix, such as edge servers and IoT devices that have robust built-in computers, there are new opportunities for malicious attackers to compromise these devices.
Another drawback with edge computing is that it requires more local hardware. For example, while an IoT camera needs a built-in computer to send its raw video data to a web server, it would require a much more sophisticated computer with more processing power in order for it to run its own motion-detection algorithms. But the dropping costs of hardware are making it cheaper to build smarter devices.