12 September 2017

The “Operating System” model for the Digital Society

We are witnessing a number of techno-economic drivers (e.g., global, low-cost access to IT and network technologies, a trend that is still accelerating) which are creating the conditions for a “Cambrian explosion” of new roles, services, value chains, etc. This is true for Telecommunications/ICT and also for several social contexts (e.g., Smart Cities) and industrial ecosystems (e.g., Industry 4.0).

We realize that Telecom infrastructures will have to “tame” a growing “complexity” (e.g., hyper-connectivity, heterogeneity of nodes and systems, high levels of dynamism, the emergence of non-linear dynamics in feedback loops, possible uncontrolled interactions); they will have to be very effective, low-cost and self-adaptable to highly variable context dynamics (e.g., the need to change strategies with other Players, fast provisioning of any service, adaptive enforcement of business policies to meet end-User and Vertical App requirements, local-vs-global geographical policies, etc.).

We have mentioned several times that, in order to face such challenges, we need proper innovative paradigms (e.g., based on DevOps, adopting Computational Intelligence, capable of scaling to millions of VMs/Containers) to manage the future Softwarized Telecom infrastructures (i.e., based on SDN and NFV, pursuing the decoupling of HW from SW, virtualization and the Cloudification/Edgification of functions and services). And this implies challenges that are not only technical/engineering, but also related to governance, organization, culture, skills, etc.

Now let’s open this vision and extend the concept of infrastructure beyond Telecoms. A Smart City also has its own physical infrastructure, which is heterogeneous and includes a complex variety of resources whose dynamics are intertwined; so does a smart factory in Industry 4.0. These infrastructures, too, will have to be very effective, low-cost and self-adaptable to highly variable context dynamics.

So my take is that we are facing a sort of non-linear phase transition of a complex system (the intertwining of our Society, Industries, Culture…) whose control variables include hyper-connectivity, globalization, digitalization, etc. How can we extract value from this phase transition?

The model of an Operating System (OS) would represent – for any Industry adopting it – the “strategic and unifying approach” to managing this phase transition. Not only does it allow taming the complex oscillations of this transition, but it also dynamically extracts value from them, creating and running ecosystems, even new ones.

In essence, this requires the virtualization/abstraction of all resources/services/functions (in a broad sense, including those of a Smart City or an I4.0 Factory) and secure API access to them by End-Users/Developers, Third Parties and other related Operators.
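As a rough illustration, here is a minimal sketch of what such a uniform abstraction with secure API access might look like; all resource types, names and the token check are hypothetical placeholders, not an actual City or Factory API:

```python
# A minimal sketch (hypothetical names): a uniform abstraction layer exposing
# heterogeneous resources (Smart City, I4.0 Factory, ...) behind one secure
# API, so End-Users, Developers and Third Parties access them in the same way.
from abc import ABC, abstractmethod

class Resource(ABC):
    """Any virtualized resource/service/function, regardless of domain."""
    @abstractmethod
    def status(self) -> dict: ...
    @abstractmethod
    def configure(self, **params) -> None: ...

class TrafficLightArray(Resource):           # a Smart City resource
    def status(self): return {"mode": "adaptive"}
    def configure(self, **params): print("retiming lights:", params)

class AssemblyCell(Resource):                # an I4.0 Factory resource
    def status(self): return {"throughput": 120}
    def configure(self, **params): print("re-planning cell:", params)

class DigitalSocietyOS:
    """Registers resources and mediates authenticated API access to them."""
    def __init__(self, valid_tokens: set[str]):
        self._resources: dict[str, Resource] = {}
        self._valid_tokens = valid_tokens

    def register(self, name: str, res: Resource) -> None:
        self._resources[name] = res

    def call(self, token: str, name: str, action: str, **params):
        if token not in self._valid_tokens:   # stand-in for real security
            raise PermissionError("invalid credentials")
        res = self._resources[name]
        return res.status() if action == "status" else res.configure(**params)

os_ = DigitalSocietyOS(valid_tokens={"third-party-key"})
os_.register("city/lights", TrafficLightArray())
os_.register("factory/cell-7", AssemblyCell())
print(os_.call("third-party-key", "city/lights", "status"))
```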


The future sustainability of the Digital Society is about the flourishing and running of 5G Softwarised Ecosystems.

My take is that we need systems thinking to design this Digital Society OS, making it capable of enabling dynamic trade-offs along the axes Slow-Cheap vs Fast-Costly and Flexible-General vs Inflexible-Special.

Eventually, look at how Nature implemented it... with a very distributed and resilient approach.


08 September 2017

Talking the language of Softwarization: towards Service2Vectors (part 2)

Modularization of SDI functions and services can be achieved through Network and Service Primitives (NSP): this will increase the flexibility, programmability and resilience of the SDI, for example improving agility in software development and operations when using DevOps approaches. On the other hand, there is a cost to pay: it increases the level of complexity of the SDI.

Management, control and orchestration (and in general all the OSS/BSS processes) of a SDI must then deal with an enormous number of NSPs which have to be interconnected/hooked and operated to implement (the logic of) network services and functions. Moreover, said NSPs have to be continuously updated and released.

This can be simplified and, above all, automated by using a dynamic multi-dimensional service space in which distributed representations of all the NSPs of a SDI are encoded. Remember what is done, for example, in the approaches adopted for word embedding in Natural Language Processing (NLP): see, for instance, this tutorial on the word2vec model by Mikolov et al. This model is used for learning vector representations of words.
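To make the analogy concrete, here is a minimal sketch that treats each existing service composition as a “sentence” of NSP identifiers and learns embeddings with word2vec; it assumes the gensim library, and all NSP names and compositions are hypothetical:

```python
# A minimal sketch: learning NSP embeddings the way word2vec learns word
# embeddings. Assumes gensim is installed; all NSP names and service
# compositions below are hypothetical placeholders.
from gensim.models import Word2Vec

# Each existing service is treated as a "sentence": an ordered composition
# of the NSPs that implement it.
service_compositions = [
    ["classify_flow", "inspect_packet", "apply_policy", "forward"],
    ["classify_flow", "rate_limit", "queue", "forward"],
    ["inspect_packet", "drop_malicious", "log_event"],
]

# Train skip-gram embeddings: each NSP becomes a dense vector whose
# coordinates capture its co-occurrence with other NSPs.
model = Word2Vec(service_compositions, vector_size=50, window=3,
                 min_count=1, sg=1, epochs=200)

# NSPs appearing in similar service contexts end up close in the space.
print(model.wv.most_similar("inspect_packet", topn=3))
```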

Leveraging this thinking, I’ve invented a method (service2Vectors) for the distributed representation of an NSP as a vector of several elements, each of which captures the relationships with other NSPs. So, each NSP is represented by a distribution of weights across the elements of the vector, which comes to represent, in some abstract way, the ‘meaning’ of the NSP. Said NSP vectors can be seen as single points in a high-dimensional service space. This multi-dimensional space can be created and continuously updated by using Artificial Intelligence (A.I.) learning methods (e.g., recurrent neural networks).

In a SDI there might be thousands (or more) of different NSPs: together they form a sort of vocabulary whose terms can be used to express the SDI services (for example through an intent-based language; example below). Let’s assume, for example, that this vocabulary of NSPs has 1000 elements: then each vector representing an NSP (in its initial one-hot form) will have V = 1000 elements, and the NSP can be represented by a point in a space of 1000 dimensions.
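As a toy illustration of this starting representation (with a vocabulary of 4 instead of 1000, and hypothetical NSP names):

```python
# A toy illustration: the NSP vocabulary, and the V-dimensional one-hot
# vector that represents a single NSP as a point in that space.
import numpy as np

vocabulary = ["forward", "inspect_packet", "rate_limit", "queue"]  # V = 4 here
V = len(vocabulary)

def one_hot(nsp: str) -> np.ndarray:
    """Return the V-dimensional one-hot vector for an NSP."""
    v = np.zeros(V)
    v[vocabulary.index(nsp)] = 1.0
    return v

print(one_hot("rate_limit"))  # [0. 0. 1. 0.]
```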

These distributed representations of NSPs in a multi-dimensional service space allow A.I. learning algorithms to process the “language” (e.g., an intent-based language; example below) used by Applications and Users to formulate service requests to the SDI. In fact, NSP vectors can be given as inputs to a recurrent neural network which can be trained, for example, to predict a certain service context given an NSP and/or, vice versa, an NSP given a certain service context. The learning algorithm could iterate, for example, through sets of thousands of service contexts (existing compositions of NSPs).

Once the recurrent neural network is trained to make said predictions to some level of accuracy, the output is the so-called space matrix of the trained neural network, capable of projecting any NSP vector into the space. NSPs with similar contexts tend to cluster in this space; the matrix can then be queried, for example, to find relationships between NSPs, or the level of similarity between them.
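A minimal sketch of such a query, assuming a trained V x d embedding matrix (here filled with random stand-in values) and cosine similarity as the distance measure:

```python
# Querying the (stand-in) trained embedding matrix for the most similar NSP.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
embedding = rng.normal(size=(4, 50))   # stand-in for the trained V x d matrix

def most_similar(i: int) -> int:
    """Index of the NSP whose embedding is closest to that of NSP i."""
    others = [j for j in range(len(embedding)) if j != i]
    scores = [cosine_similarity(embedding[i], embedding[j]) for j in others]
    return others[int(np.argmax(scores))]

print(most_similar(0))
```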

An alternative is to provide a distributed representation of each SDI service (instead of the single NSP) with a vector of several elements, each of which captures the relationships with other SDI services. So, each SDI service is represented by a distribution of weights across the elements of the vector. Said SDI service vectors can be seen as single points in a high-dimensional service space. This multi-dimensional space can be created and continuously updated by using Artificial Intelligence (A.I.) learning methods (e.g., recurrent neural networks).

This recalls what Prof. Geoff Hinton argued when introducing the term "thought vector": “it is possible to embed an entire thought or sentence — including actions, verbs, subjects, adjectives, adverbs etc. — as a single point (i.e., vector) in a high dimensional space”. If the thought-vector structure of human language encodes the key primitives used in human intelligence, then the SDI service-vector structure could encode the key primitives used by “application intelligence”.

Moreover, thought vectors have been observed empirically to possess certain properties: one, for example, is known as “Linear Structure”, i.e., certain directions in thought-space can be given semantic meaning, and consequently the whole thought vector is the geometric sum of a set of directions, or primitives. In the same way, certain directions in the SDI service space can be given a context meaning, and consequently the whole SDI service vector can be seen as the geometric sum of a set of directions, or primitives.
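A toy sketch of what such Linear Structure could allow, by analogy with word2vec’s well-known king - man + woman ≈ queen arithmetic; the service names are hypothetical, and the vectors here are random stand-ins for trained ones:

```python
# Vector arithmetic in the service space: composing meanings by adding and
# subtracting direction vectors, then retrieving the nearest known service.
import numpy as np

rng = np.random.default_rng(1)
services = {name: rng.normal(size=50) for name in [
    "video_streaming", "best_effort_transport",
    "low_latency_transport", "cloud_gaming", "iot_telemetry"]}

query = (services["video_streaming"]
         - services["best_effort_transport"]
         + services["low_latency_transport"])

def nearest(q: np.ndarray) -> str:
    """Service whose vector is closest (cosine) to the query vector."""
    return max(services, key=lambda s: float(
        np.dot(services[s], q)
        / (np.linalg.norm(services[s]) * np.linalg.norm(q))))

print(nearest(query))  # with trained vectors: a low-latency video service
```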

Hopefully this will pave the way for Humans and non-human Users (apps, avatars, smart or cognitive objects, processes, A.I. entities, …) “to talk” with Softwarised Infrastructures in a common, quasi-natural language.

06 September 2017

Talking the language of Softwarization: towards Service2Vectors (part 1)

Cost reductions and new revenue flows are key business drivers for the sustainability of Network Operators. Telecommunications infrastructures are growing in heterogeneity and complexity, but at the same time they should be agile and flexible, reliable and programmable... to cope with market dynamics (of increasing "frequency").
In one word, this is a new cycle of "complexity". Complexity is ever-growing in Nature, by definition, at least until a "tool" is found to "tame" it and to make a "phase transition" to a new "state".
We know that recent advances in enabling technologies such as SDN and NFV are offering the means to decouple the hardware and software architectures and to introduce the virtualization of resources (the so-called Softwarization of Telecommunications infrastructures). At the same time, the evolution of Cloud Computing towards Edge and Fog Computing, Artificial Intelligence, multi-level APIs, etc. represent other technology trends which are intersecting SDN and NFV in shaping future Software Defined Infrastructures (SDI).
Still, the management, control and orchestration systems of a SDI must make sure that infrastructure services, characterized by specific KPIs (Key Performance Indicators), are provisioned to Applications and Users upon their specific requests. This implies carrying out operational tasks in a new way: management and control of both physical resources (e.g., physical nodes and IT servers, physical connections) and (millions of) virtual resources (e.g., virtual machines or containers, virtual links and virtual network functions), scheduling and end-to-end orchestration of virtual network functions and services, etc. In fact, Softwarization means that virtualized network functions and services can be dynamically allocated and even executed in Cloud Computing and/or Edge Computing (e.g., in centralized Data Centres and/or in mini Data Centres, which can be located at network PoPs equipped with processing and storage capabilities).
This also allows exploiting the model of Service Chaining (also known as Service Function Chaining) in a SDI. In general, Service Chaining is about creating and provisioning a network service as a sequence, or chain, of interconnected network functions and services, hooking together the logical resources where they are executed by steering the traffic through them.
Obviously, just like applications, said network functions and services of a SDI can be modeled and developed by combining software tasks and/or Microservices and network primitives. As is known, an application can be modeled as a core part containing the application logic, plus adapters that interface the application with the external world. Examples of adapters include database access components, messaging components that produce and consume messages, or web components that either expose APIs or implement a User Interface (UI). Instead of developing a monolithic application, the Microservices architectural paradigm proposes splitting the application into a set of smaller, interconnected services (called Microservices). Microservices are basically modular software components, each of which runs as a unique process that can be deployed independently, with minimal centralized management. For example, some Microservices can expose an API that is consumed by other Microservices or by the application’s clients, while other Microservices can implement a web UI.
One advantage is that these smaller components can be developed and scaled independently: this improves agility in software development and operations, promoting resilience and scalability. Microservices can also be used for developing network and service functions in a SDI: in fact, a Virtual Network Function can be decomposed into a sequence/combination of Microservices and/or network primitives.
Generalizing, Microservices could be seen as any kind of packet-processing primitives (also called network or service primitives) which can be dynamically composed and executed on different hardware architectures. Examples of said packet-processing primitives include: packet forwarding, packet inspection, packet modification (including dropping a packet), queuing, flow control and scheduling, or any other software tasks/functions (such as those required to create any VNF) providing access to nodes (e.g., node address, interfaces, link status) and to their storage and processing resources. A minimal sketch of such a composition follows.
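As an illustration only (not a real VNF framework), the following sketch models packet-processing primitives as functions over a packet and composes them into a service chain; all primitive and field names are hypothetical:

```python
# Packet-processing primitives as composable functions over a packet dict,
# chained into a service; None means the packet was dropped along the chain.
from functools import reduce
from typing import Callable, Optional

Packet = dict
Primitive = Callable[[Optional[Packet]], Optional[Packet]]

def inspect(pkt):
    """Inspection primitive: drop packets flagged as malicious."""
    if pkt is None or pkt.get("malicious"):
        return None
    return pkt

def mark_priority(pkt):
    """Modification primitive: tag latency-sensitive flows as high priority."""
    if pkt is not None and pkt.get("flow") == "voice":
        pkt["priority"] = "high"
    return pkt

def forward(pkt):
    """Forwarding primitive (stub: just print the packet)."""
    if pkt is not None:
        print("forwarding", pkt)
    return pkt

def chain(*primitives: Primitive) -> Primitive:
    """Compose primitives into a service chain, applied in order."""
    return lambda pkt: reduce(lambda p, f: f(p), primitives, pkt)

service = chain(inspect, mark_priority, forward)
service({"flow": "voice", "malicious": False})
```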
Part 2 to come next !

01 September 2017

The rise of a Networked AI with humans-in-the-loop

The programmability, flexibility and high levels of automation of 5G operations will reduce costs (e.g., OPEX) and create new service paradigms which might be even beyond our imagination. Some examples concern the applications of the Internet of Things, Tactile Internet, advanced Robotics, Immersive Communications and, in general, the X-as-a-Service paradigm.

Let us consider some examples. Cloud Robotics and 5G-controlled robotics will have huge impacts in several sectors, such as industrial and agricultural automation, in smart cities and in many domestic applications. In agriculture, autonomous machines will be used for tasks like crop inspection, the targeted use of water and pesticides, and for other actions and monitoring activities that will assist farmers, as well as in data gathering, exchange and processing for process optimization. Interestingly, Cloud Robotics and 5G APIs can be opened to end-users and third parties to develop, program and provide any type of related service or application pursuing specific tasks. In industry, this will pave the way to process automation, data exchange and robotic manufacturing technologies (e.g., Industry 4.0). It is likely that we will soon see robotic applications in the domestic environment: it is estimated that by 2050-2060 one third of European people will be over 65, and the cost of the combined pension and health care system could be close to 29% of the European GDP. Remotely controlled and operated robots will enable remote medical/supportive care and open up a new world of domestic applications which may be adopted by the entire population (e.g., cleaning, cooking, playing, communicating, etc.).

5G will also have a big impact on the automotive and transportation markets. Nevertheless, there are still open issues. Even if significant progress has been made in developing self-driving/autonomous machines, equipped with sensors, actuators and ICT capabilities, the achievement of very low reaction times still represents an open challenge. As a matter of fact, autonomous driving in real traffic is a very challenging problem: reaction times of a few milliseconds, or even less, are needed for safety reasons to avoid sudden and unpredictable obstacles. This means that a considerable amount of computing and storage power must always be available through ultra-low latency links. Today, the amount of computing and storage power that can be fitted locally in a machine/vehicle is not enough (for several reasons, e.g., space, dissipation limits, cost restraints, etc.) to cope with these requirements. Huge amounts of data need to be stored and accessed, and the AI methods have to be executed very quickly to achieve such levels of reactive autonomy. An ultra-low latency 5G network will allow exploiting the best balance of resources across Cloud and Edge Computing systems, thus offering trade-offs between local and global cognition execution, essential to minimize reaction times.
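To illustrate the local-vs-edge-vs-cloud trade-off, here is a toy latency-budget check; the figures are purely illustrative, not measurements:

```python
# A toy latency-budget check: decide whether a reaction task can be offloaded
# to edge or cloud, or must run on-board, given a hard reaction deadline.
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    rtt_ms: float        # round-trip network latency to the site
    compute_ms: float    # processing time at the site

def placement(deadline_ms: float, sites: list[Site]) -> str:
    """Pick the nearest site whose total latency fits the reaction deadline."""
    for site in sorted(sites, key=lambda s: s.rtt_ms):
        if site.rtt_ms + site.compute_ms <= deadline_ms:
            return site.name
    return "local"  # nothing fits: fall back to on-board compute

sites = [Site("edge", rtt_ms=2.0, compute_ms=4.0),
         Site("cloud", rtt_ms=30.0, compute_ms=1.0)]
print(placement(deadline_ms=10.0, sites=sites))   # -> "edge"
print(placement(deadline_ms=5.0, sites=sites))    # -> "local"
```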

In a similar direction, real-time image/video processing, for example for recognizing forms, faces or even emotions in photos or live-streamed video, represents another challenging case study for AI in 5G infrastructures. This could be radically improved by the distributed execution of deep learning solutions in a 5G infrastructure capable of providing ultra-low latency connectivity links. Also in this case, performance will be improved by the flexibility of 5G in dynamically allocating/moving both huge data sets and software tasks/services to where/when it is most effective to have them.

Another example is Immersive Communications, which refers to a paradigm going beyond the “commoditization” of current communication means (e.g., voice, messaging, social media, etc.). Immersive Communications will be enabled by new advanced technologies for social communication and interaction, for example through artificially intelligent avatars, cognitive robot-human interfaces, etc. Eventually, the term X-as-a-Service will refer to the possibility of providing (anytime and anywhere) wider and wider sets of 5G services by means of anything from machines to smart things, from robots to toys, etc. If today we are already linking our minds with laptops, tablets, smartphones, wearable devices and avatars, in the future we will see enhanced forms of interaction between humans, intelligent machines and software processes.

So it is argued that current socio-economic drivers and ICT trends are already bringing Computer Science, Telecommunications and AI to a convergence.

In this profound transformation, mathematics will be the language, computation will be about running that language (coded in software), storage will be about saving this encoded information, and, eventually, the network will be creating relationships – at almost zero latency – between these sets of functions. This trend will also see the rise of the so-called Networked AI with humans-in-the-loop. Today there are already some examples, such as analyst-in-the-loop security systems, which combine human experts’ intuition with machine learning capable of predicting infrastructure cyber-attacks.

Although security and privacy are out of the scope of this work (which focuses on 5G enabling capabilities), these two strategic areas deserve some further consideration. On the one side, 5G could provide the means for improving security, for example because information will be available everywhere and the context needed to detect anomalous behavior will be more easily provided; on the other side, enabling technologies such as SDN and NFV have the potential to create situations where all primary personal data and information is held and controlled at a global level, even outside the national jurisdiction of individual citizens. Consider the example mentioned above of the real-time processing of several thousands of images per second and of live-streamed video: this will have wide-ranging, but also controversial, applications, from predicting crimes, terrorist acts and social upheaval to law enforcement and psychological analysis. Eventually, in the long term, this might transform everything from policing to the way people interact every day with banks, stores and transportation services: this will have huge security and privacy implications.

Reasonably, privacy and security concerns should be addressed by design, with systemic solutions capable of operating at different levels in future 5G infrastructures: for example, such a design will need to consider issues such as automated mutual authentication, isolation, and the data access and management of the multiple virtual network slices coexisting on the same 5G infrastructure.

16 May 2017

Operating Systems for Cognitive Cities

The idea of exploiting a sort of Operating System for Smart Cities is not new. For a few years now, some Cities have been developing and experimenting with it. Just to mention some examples, there are the brilliant experiences of Bristol and Barcelona with the so-called CityOS.

We know that, in Computing systems, the adoption of Operating Systems facilitated application development and diffusion by providing controlled access to high-level abstractions of the hardware resources (e.g., memory, storage, communication) and information (e.g., files, directories). Similarly, in a Smart City, one may imagine a sort of Operating System facilitating the development of the City’s applications and services by providing controlled access to high-level abstractions of the City’s resources.

In general, a City Operating System will allow:
  • collecting and sharing data in a city;
  • elaborating said data and inferring decisions (actuation), driving multiple actuators, devices and smart things to communicate with, control and optimize the city’s processes;
  • providing any sort of ICT services for a City.

In other words a City Operating System will allow:
  • sensing, collecting and storing (even locally) massive data sets (through terminals, smart things, intelligent machines);
  • transporting huge sets of data quickly (through high-bandwidth and ultra-low latency network connections) to wherever it is most convenient (allocation of virtual functions);
  • elaborating big data (with A.I. and Cognitive methods in Cloud and Edge/Fog Computing) to infer decisions for actuating/controlling local actions;

so it will introduce cognitive “control loops” into the City, creating a sort of Nervous System for it ! That’s why I like to call the cities of the future Cognitive Cities. A minimal sketch of such a control loop is given below.
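The sketch follows the sense-infer-actuate pattern described above; all sensor readings, thresholds and actuator names are hypothetical stubs:

```python
# A minimal sketch of a cognitive control loop for a City OS:
# sense -> infer -> actuate, repeated periodically.
import time

def sense() -> dict:
    """Collect data from city sensors (stub: fixed readings)."""
    return {"traffic_density": 0.85, "air_quality_index": 140}

def infer(observation: dict) -> list[str]:
    """Infer decisions from the observation (stub for an A.I. model)."""
    decisions = []
    if observation["traffic_density"] > 0.8:
        decisions.append("retime_traffic_lights")
    if observation["air_quality_index"] > 100:
        decisions.append("restrict_city_center_access")
    return decisions

def actuate(decisions: list[str]) -> None:
    """Push decisions to city actuators (stub: print them)."""
    for d in decisions:
        print("actuating:", d)

def control_loop(iterations: int, period_s: float = 1.0) -> None:
    for _ in range(iterations):
        actuate(infer(sense()))
        time.sleep(period_s)

control_loop(iterations=3)
```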

Obviously, a Cognitive City OS will include some of the functions/capabilities typical of an Operating System, but referred specifically to the resources and services of a City... and my take is that A.I. will be everywhere around us, fundamental in helping to tame the cyber-security risks.



In fact, up to now we have been quoting the well-known sentence "Software is eating the World", but looking ahead it will be more like "Cognition will optimise the World" !

Take a look here link

02 May 2017

Technology evolution as a collective phenomenon

Today we are witnessing a growing interest in Artificial Intelligence methods and systems (from the terminals to the Network nodes to the Clouds), and in embedding cognitive capabilities into robots or autonomous vehicles, self-learning avatars, autonomic bots, etc. We are even looking at a sort of Nervous System for the overall Digital Society and Economy (see my previous posts). It looks like we are pursuing the embodiment of the "cognition and autonomic" paradigms in Telecommunications and ICT.

Along this avenue, I believe we need to leverage much more than we currently do on Biology, Neuroscience, Analytical Psychology and all those efforts targeting the understanding of Biological Intelligence; not only that: we also need to leverage on Physics for the deeper physical phenomena governing cognition. Some solutions are already there, and maybe we just need to apply them to a specific new context. I could mention several of them.

The theory of F. Varela is one example of a still very popular approach to understanding the roots of cognition in very simple living entities (see “The Embodied Mind: Cognitive Science and Human Experience”, Cambridge, MA: MIT Press).

The theory argues that the adaptive behaviour of a simple living system (e.g., an ant or a bee) is based on two interrelated points: 1) perception consisting of perceptually guided action and 2) cognitive structures, emerging from recurrent sensorimotor patterns, enabling action to be perceptually guided. In particular, during their life, living systems cross several diverse cognitive domains (called micro-worlds) which are generated by their (local and perhaps also non-local) interactions with the external environment. Within a micro-world, behaviour is determined by pre-defined sensorimotor loops: very simple, very fast and automatic. From time to time, breakdowns occur: unexpected disruptive situations that force a change from one cognitive domain (i.e., from one micro-world) to another. Importantly, this bridging (during breakdowns) is assured by the “intelligence” of the nervous system (allowing a new adaptation and the consequent learning of new sensorimotor loops). So, within a certain micro cognitive domain, behaviour is directed by a set of sensorimotor loops, which are fast and perform well-trained, automatic reactions to the local situation. When a breakdown occurs, i.e., an unexpected event, the nervous system reacts by developing a set of possible alternative reactions. During these trial-and-error phases, eventually a specific sensorimotor loop prevails which allows reacting properly to the unexpected event. So the living system, entering this new micro cognitive domain, has learnt a new sensorimotor loop, and so on.

Example: imagine a termite bringing some food into the nest (a sensorimotor loop) when suddenly a gallery collapses: this is a breakdown. The termite must enter a new cognitive micro-world to try to overcome the obstacle. A new sensorimotor loop is developed and learnt (how to overcome a collapsed gallery). The connections between these micro cognitive domains are said to happen through a sort of overall structural coupling with the environment (the colony), through a sort of “field”, i.e., space-time gradients of electromagnetic fields (and potentials) and sounds, intertwined with tactile and metabolic information. These gradients trigger the overall collective reactions, e.g., in terms of alignment of micro cognitive worlds. This is collective intelligence; this is how Nature’s technologies work in a sustainable way.

This cognition model (balancing local vs global cognition) could be applied directly to develop swarms of robots or drones in Industry 4.0 scenarios, as in the sketch below.
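A toy sketch of Varela-style micro-worlds for a single robot/drone; the behaviours, situations and the “learning” rule are hypothetical placeholders for real control policies:

```python
# Fast pre-learned sensorimotor loops per micro-world, with a switch to a
# new micro-world (and a newly "learnt" loop) triggered by a breakdown.
class Agent:
    def __init__(self):
        # Each micro-world maps expected situations to fast automatic actions.
        self.micro_worlds = {
            "carry_food": {"clear_path": "advance", "at_nest": "deposit"},
        }
        self.current = "carry_food"

    def act(self, situation: str) -> str:
        loops = self.micro_worlds[self.current]
        if situation in loops:
            return loops[situation]          # fast, automatic reaction
        return self.breakdown(situation)     # unexpected event: breakdown

    def breakdown(self, situation: str) -> str:
        """Trial-and-error stub: learn a new loop, enter a new micro-world."""
        new_world = f"handle_{situation}"
        self.micro_worlds[new_world] = {situation: "find_alternative_route"}
        self.current = new_world
        return self.micro_worlds[new_world][situation]

agent = Agent()
print(agent.act("clear_path"))         # advance
print(agent.act("collapsed_gallery"))  # breakdown -> new loop learnt
```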

More generally, I would argue that technology evolution, to be sustainable, should be seen as a collective phenomenon ! Have a look at this amazing paper: N. Goldenfeld, “Life is physics: evolution as a collective phenomenon far from equilibrium”:


27 April 2017

A.I.: What's next ? Biological intelligence (B.I.)

Biological intelligence concerns all the control and adaptive systems that are not artefacts, but rather are exploited by Nature in living entities after millions of years of evolution.

Normally, when we think about Biological Intelligence we refer to human brains and nervous-system functions, but there is much more in Nature. Think about the collective intelligence of colony species like ants and bees, capable of adapting and co-evolving as ecosystems in a changing environment. These colonies – like our organs! – are complex adaptive systems, open in the sense that they exchange matter, energy and information with the external environment. This Biological Intelligence is self-organizing.

Biological Intelligence is, obviously, far beyond our most advanced thinking on Artificial Intelligence (A.I.) today. A.I., in most cases, is still based on heuristics and algorithms (e.g., ML, DL, neural networks, etc.) using binary logic and, above all, it is reductionist. Biological Intelligence leverages the deeper quantum phenomena at the most basic level of life: binary logic is very different from the tangled interactions of quantum mechanics.

Along A.I. avenues we are making outstanding progress, and we have great visions of how to make a business out of it! For example, two amazing projects were announced last week aiming at advancing A.I.: Facebook’s plan to develop a non-invasive brain-computer interface that will let you type at 100 words per minute, and Elon Musk’s proposal that we become superhuman cyborgs to deal with superintelligent AI.

Also, a few days ago Apple suggested at a TED 2017 conference that "instead of replacing humans with robots, artificial intelligence should be used to give us super-human abilities”. 

No doubt that high-bandwidth/low-latency connectivity + massive A.I. (Cloud/Edge/Fog) + B.C.I. (or similar advanced interfaces for humans) are likely to bring us to the next big Internet, with far-reaching socio-economic implications... but beyond that there is a much more challenging and impactful frontier for us, which is understanding Biological Intelligence and, as such, life.

In fact, this implies looking at more subtle biological processes and interaction paradigms, maybe less familiar in Computer Science but surely nearer to Quantum Biology.

Capturing the essence of Biological intelligence is the biggest bet we can make !

13 April 2017

A Network Operating System for becoming a "Differentiator"

Let’s now go a little bit beyond the plain metaphor of future telecommunications infrastructures as nervous systems of the Digital Society and Economy.

At the beginning of this year, AT&T announced Network 3.0 Indigo as the next step after Network Domain 2.0: AT&T’s major push is now to create a trusted environment where organizations can share data and collaborate on analytics. It is about moving, strategically, from playing the role of Architect to that of Differentiator, in order to be truly competitive on data-powered services.

Let’s see an example, quoting the text on the link: “Imagine this simple example. A city has an electric utility, an internet company and a major heating/air conditioning repair company. They join a technician dispatch community to share their data – such as vehicle, traffic and appointment data. Through cooperative machine-learning and the broader data set, they get better and better at timing their dispatches. Their work becomes more efficient and customers are happier. And they are still able to keep their proprietary information safe”.

That brings to mind the concept of a Smart City as an organism with a Nervous System! As a matter of fact, there are remarkable examples such as the CityOS of Bristol or Barcelona. We may also reinforce this concept with what is mentioned here: “Indigo is yet another step in the network transformation of AT&T, and that network operating system will play a role to deliver the data-powered services that make all the rest of the work worthwhile”.


So we can see here how the network operating systems of future telecommunications infrastructures will eventually embed the features of a nervous system for the Digital Society and Economy, and how the Operators’ infrastructures are the nerves and neurons of such a nervous system… including their processing capabilities. Having said that, it is clear that the network operating system is not only about management, control and orchestration: it is the most powerful instrument for implementing business strategies and novel forms of competitive advantage.

Quoting J. Doyle: to be successful, this network operating system should be “robust yet fragile”.



04 April 2017

Telecoms infrastructures will be the Nervous System of the Digital Society and Economy: what better use case !

If you consider the metaphor of the Nervous System, you soon realize that future Telecommunications infrastructures (e.g., 5G), Cloud-Edge and Fog Computing, Artificial Intelligence and IoT (i.e., pervasive sensing and actuation) will naturally have to converge quite soon to become the Nervous System of the Digital Society and Economy.

The nervous system of a living entity, in fact, is a complex network of nerves and cells that carry messages to and from the brain and spinal cord to various parts of the body. The nervous system includes both the Central nervous system and the Peripheral nervous system: the former is made up of the brain and spinal cord (a tube-like structure which extends from the brain), the latter of the Somatic and Autonomic nervous systems. The somatic nervous system consists of peripheral nerve fibers that pick up sensory information, or sensations, from the peripheral or distant organs (those away from the brain, like the limbs) and carry them to the central nervous system. The autonomic nervous system controls the nerves of the inner organs of the body over which humans have no conscious control (it governs, e.g., heartbeat, digestion, breathing).

These well-known, simple definitions are enough to understand the metaphor for the evolution of Telecommunications infrastructures (e.g., 5G), Cloud-Edge and Fog Computing, Artificial Intelligence and IoT. And the use case is very clear! The nervous system enables the survival of a living entity by supporting functions to perceive, comprehend and adapt to the world around it; moreover, the nervous system operates the body’s essential physiologic functions, such as breathing and digestion. In the same way, Telecommunications infrastructures will become the Nervous System of the Digital Society and Economy, enabling their sustainability by supporting functions to perceive, comprehend and adapt to the political and socio-economic environments; moreover, they will operate the essential physiologic management and control processes of the Digital Society and Economy. As it is now generally accepted that a Nervous System can learn, rearrange and adapt – a process often referred to as neuroplasticity – so will Telecommunications infrastructures, running A.I. on Cloud-Edge and Fog Computing resources and collecting sensing information from IoT devices. Virtualization (SDN and NFV) will provide such neuroplasticity to the Telecommunications Nervous System.

So, rather than talking about CityBrains for Smart Cities, it makes more sense to refer to a global Nervous System (which is much more than a brain) of the Digital Society and Economy.


Eventually, my personal take on future Telecommunications infrastructures (e.g., 5G), Cloud-Edge and Fog Computing, Artificial Intelligence and IoT… is that they should be considered systemically, as jointly creating an ecosystem, far beyond segmented sectors or verticals. And the business case is horizontal: bringing our Society to digital life.

14 February 2017

A.I. detecting early-warning signals …in the complexity of Digital Society and Economy

The Digital Society and Economy are literally becoming complex systems. All sub-systems, at the different levels, are hyper-connected through non-linear relationships. Sudden regime/phase transitions can occur, radically changing the scenario.
Can we detect early the tipping points of such sudden regime/phase transitions?

Predicting such tipping points before they are reached is quite difficult, but it could have a huge impact in several fields: from medicine to biology, from meteorology to social networking, from management and business to cyber-security.

There are some nice research papers providing guidelines. One of them is “Early-warning signals for critical transitions”. The paper suggests the analysis of generic early-warning signals indicating, for a wide class of complex systems, the approach of a critical threshold, where small forces can cause major changes of state. Examples of such transitions include the collapse of over-harvested ecosystems, climatic changes, or stock market dynamics.

For example, one symptom is critical slowing down: when the system approaches a critical transition, it becomes increasingly slow in recovering from small perturbations (which translates mathematically into an increase in the autocorrelation and variance of the fluctuations). Another signal that can be seen in the vicinity of a catastrophic transition point is flickering: stochastically, the system moves back and forth between the basins of attraction of two alternative attractors (bistable region). Spatial patterns are a third example: an ecosystem may show a predictable sequence of self-organized spatial patterns as it approaches a critical transition (e.g., the response of semi-arid vegetation to increasing dryness of the climate). A simple detector of critical slowing down is sketched below.
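A minimal sketch of the critical-slowing-down signal just described: rising lag-1 autocorrelation and variance over a sliding window; the time series here is synthetic, built only to mimic a degrading recovery rate:

```python
# Early-warning detection: rolling lag-1 autocorrelation and variance.
import numpy as np

def lag1_autocorr(x: np.ndarray) -> float:
    x = x - x.mean()
    denom = float(np.dot(x, x))
    return float(np.dot(x[:-1], x[1:]) / denom) if denom else 0.0

def early_warning(signal: np.ndarray, window: int = 100):
    """Rolling lag-1 autocorrelation and variance; upward trends in both
    suggest the system is approaching a critical transition."""
    ac, var = [], []
    for i in range(len(signal) - window):
        w = signal[i:i + window]
        ac.append(lag1_autocorr(w))
        var.append(float(w.var()))
    return np.array(ac), np.array(var)

# Synthetic example: an AR(1) process whose recovery from perturbations
# slowly degrades (autocorrelation drifting towards 1).
rng = np.random.default_rng(42)
n, x = 2000, [0.0]
for t in range(1, n):
    phi = 0.2 + 0.7 * t / n            # ever-slower recovery
    x.append(phi * x[-1] + rng.normal())
ac, var = early_warning(np.array(x))
print(f"autocorr: {ac[0]:.2f} -> {ac[-1]:.2f}, "
      f"variance: {var[0]:.2f} -> {var[-1]:.2f}")
```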

Another recommended reading is this one:


Big Data analysis by A.I. systems could bring a breakthrough in this promising area of research and innovation, also on the path towards a sustainable 5G.
The potential gains from investing in these studies are formidable.

24 January 2017

If there will be a 6G...it will be Quantum !

Today 5G is under the spotlight worldwide. The more we look at it, the more we realise that the 5G infrastructure will be a radical evolution of 4G/LTE and fixed networks. In fact, current trends show that 5G will not be just an increase of radio bandwidth, better performance and improved reliability: 5G will deeply integrate processing, storage and (fixed-mobile) network capabilities in highly flexible and programmable architectures, almost fully "automated".

As a matter of fact, technology advances and cost reductions are bringing these capabilities pervasively into daily reality, all around us, deeply impacting every segment of Society and the Economy. Not only humans, but also machines, robots, drones and software processes will become the future Users of 5G in a newly developed Digital Society and Digital Economy.

In this sense, 5G is an expression of the digital transformation that will start to materialise from 2020, and maybe one of the major issues still open is understanding the new value chain making all of that sustainable !

But beyond that, "what's next" ?  My take is that if there will be a 6G, then it will be Quantum !

There is ample evidence of increasing efforts and investments in R&D and Innovation on Quantum systems. Some notable examples are Microsoft, IBM, HP, Toshiba, Google, NASA, Intel, Alibaba, BT and several other Centres of Excellence.

Take a look at these links:

It is true that Quantum technologies and approaches show different levels of maturity, but it is already widely believed that the first Quantum Systems will be available within five to ten years: as a matter of fact, advanced prototypes for quantum computing and communications already exist.

There is also already a Quantum Manifesto calling upon Member States and the European Commission to launch a €1 billion Flagship-scale Initiative in Quantum Technology, preparing for a start in 2018 within the European H2020 research and innovation framework programme.

A future breakthrough in the development of quantum systems at affordable prices will have systemic and far-reaching impacts, e.g.:
  • the exploitation of a Quantum Internet capable of exchanging information through fully optical networks and processing it, optically, in the form of encoded photons;

  • the development of disruptive applications in the areas of cryptography, cyber-security and anti-counterfeit transactions with “quantum money”, finance, but also in bioinformatics, quantum machine learning and artificial intelligence;

  • radical implications in other sectors and industries, such as new, faster ways of processing genetic big data, quantum biology and medicine, or the development of new nano-tech smart materials.
Would you bet on Quantum ?




10 January 2017

Softwarization for a Hyper-connected World: en route to 5G

The IEEE Institute ranked the white paper “Towards 5G Software-Defined Ecosystems” among the 10 Most Popular Articles of 2016 (ranked #5, alongside articles on the Shannon Centennial and Marie Curie’s involvement in WWI).

http://theinstitute.ieee.org/ieee-roundup/blogs/blog/the-institutes-10-most-popular-articles-of-2016

Let's meet in Bologna, at IEEE NetSoft 2017, to take the next steps.

Software-Defined Networking (SDN), Network Function Virtualization (NFV) and Cloud-Edge-Fog Computing are key ingredients of an overall techno-economic transformation trend which is deeply impacting the Telecom and ICT industries. This trend, often called “Softwarization”, will bring cost optimizations and new service paradigms.

In particular, SDN, NFV and network programmability are going to become the main enablers of the 5th Generation (5G) of infrastructures, which will span from high data rate fixed-mobile services to the Internet of Things.

This timely flagship conference of IEEE SDN will shed light on the fundamental technology components and systems for SDN-NFV infrastructures, clouds/edges and any sort of network service, in order to fully exploit their potential for the efficient handling of heterogeneous resources across wired and wireless networks and datacenter domains, and for the easy and fast deployment of new ICT services.

IEEE NetSoft will bring together academia and industry to jointly review and ponder maturing developments related to all aspects of Softwarization, and its first exploitation with 5G.

Don't miss the opportunity of joining us in Bologna, and "influencing" the way towards 5G.
There is still time to register your papers and demos !


for questions and further information: antonio.manzalini@telecomitalia.it