Edge Computing (EC) moves part of the service-specific processing and data storage from the cloud to edge network nodes. The expected benefits of EC deployment in 5G include performance improvements, traffic optimization, and new ultra-low-latency services.
While EC is gaining momentum, we are simultaneously witnessing a rapid development of Artificial Intelligence (AI) across a wide spectrum of applications, such as intelligent personal assistants, video/audio surveillance, smart-city applications, self-driving vehicles, and Industry 4.0. The requirements of these applications seem to call for a resource-hungry AI model, whose cloud-centric execution runs counter to the ongoing migration of computing, storage, and networking resources to the edge.
In reality, the two technology trends are converging in Edge Intelligence (EI): an emerging paradigm meeting the challenging requirements of future pervasive-service scenarios, where optical-radio networks require automatic, real-time joint optimization of heterogeneous computation, communication, and memory/cache resources, together with fast, high-dimensional configuration (e.g., selecting and combining optimal network functions and inference techniques).
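To make the joint-optimization idea concrete, the following toy sketch exhaustively searches over execution sites and inference models for a single service request, minimizing end-to-end latency under an energy budget and an accuracy floor. All site names, model names, and numbers are hypothetical illustrations, not measurements; real EI orchestrators would face far larger, time-varying search spaces.

```python
# Toy sketch of joint resource/configuration selection in an EI setting.
# Sites and models below are hypothetical placeholders for illustration only.
from itertools import product

# Candidate execution sites: (name, compute speed [GFLOP/s],
# network round-trip time [ms], energy cost [mJ per GFLOP]).
SITES = [
    ("edge", 50.0, 2.0, 1.5),
    ("cloud", 500.0, 40.0, 1.0),
]

# Candidate inference models: (name, workload [GFLOPs], accuracy).
MODELS = [
    ("small-cnn", 1.0, 0.85),
    ("large-cnn", 20.0, 0.93),
]

def best_configuration(energy_budget_mj, min_accuracy):
    """Exhaustively search site x model pairs; return the feasible pair
    (latency_ms, site, model) with the lowest end-to-end latency."""
    best = None
    for (site, speed, rtt, e_cost), (model, flops, acc) in product(SITES, MODELS):
        if acc < min_accuracy:
            continue  # fails the accuracy requirement
        energy = flops * e_cost
        if energy > energy_budget_mj:
            continue  # exceeds the per-request energy budget
        latency = flops / speed * 1000.0 + rtt  # compute time + network RTT, in ms
        if best is None or latency < best[0]:
            best = (latency, site, model)
    return best
```

With these illustrative numbers, a relaxed accuracy requirement favors the small model on the edge (low RTT dominates), while a strict one forces the large model into the cloud once the edge energy budget is exceeded.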
Moreover, the nexus of EI with distributed ledger technologies will enable new collaborative ecosystems that can include, but are not limited to, network operators, platform providers, AI technology/software providers, and users.
A major roadblock to this vision is the projected long-term energy consumption of a pervasive AI embedded into future network infrastructures.
Low-latency, low-energy neural network computation could be a game changer. In this direction, fully optical neural networks could offer substantial gains in computational speed together with reduced power consumption.