25 October 2015

How Telecommunications will change

In the very beginning of Telecommunications, around 1880, the business seemed to be the sale of telephones: it was supposed to be up to the buyer of the telephone to roll out the wires needed to connect with another telephone. But soon it was realized that the “connectivity fabric” was the most important, and most expensive, part of the story. So the Network Providers started making the huge (Capex) investments needed to deploy and manage such network infrastructures.
The telecommunications business didn’t change that much over the following 130 years.
But it will change radically in the coming few years, due to the convergence of a number of (well-known) techno-economic trajectories.
Example: already today most of the overall telecommunications business is in smartphones, not in the network. The number of smartphones sold versus network equipment is billions against millions, with an economic split of about 70% vs 30% in favor of the terminals. This means the market is already led by smartphones, and perhaps tomorrow by future smart terminals, such as robots, drones and any sort of autonomous machines equipped with processing, storage and communications capabilities and with sensors/actuators. This does not mean that the network is no longer important, obviously: it means that it will change radically, and so will our perception of it. It’s the Softwarization of Telecommunications, which I started predicting five years ago, and which is now really becoming reality.

Softwarization will transform the Telecommunications infrastructures from today’s networks of interconnected closed boxes (today’s nodes, e.g., switches, routers, middle-boxes, etc.) into a continuum of logical containers (e.g., Virtual Machines or Docker containers) executing millions of software processes interacting with each other. If it makes sense to allocate, move or change a functionality in a smart terminal or in an edge DC, even an SME, a User or a machine will be able to do it. Not only humans but also autonomous software entities will be able to produce and consume services in this continuum of ICT virtual resources.

That’s a radically different perspective for Telecommunications and ICT.
At this level, it makes a lot of sense to investigate how to model, control and steer the dynamics of this software continuum. The mathematics behind this, in fact, may open the way to new models for networked cognition, or even a new theory of information beyond Shannon. It will be about understanding how and why humans or software processes assemble themselves into social networks. Just imagine dynamic logical networks where every logical node is a person or an avatar and every logical link between logical nodes is a relationship between them.
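As a toy sketch of such a dynamic logical network (every name here is illustrative, not taken from any real system), one could model nodes and their time-varying relationships like this:

```python
# Toy model: logical nodes (persons or avatars) linked by relationships
# that can appear and disappear over time.
from collections import defaultdict

class LogicalNetwork:
    def __init__(self):
        self.links = defaultdict(set)  # node -> set of related nodes

    def relate(self, a, b):
        """Create a bidirectional relationship between two logical nodes."""
        self.links[a].add(b)
        self.links[b].add(a)

    def unrelate(self, a, b):
        """Remove a relationship: the logical topology is time-varying."""
        self.links[a].discard(b)
        self.links[b].discard(a)

    def degree(self, node):
        """How connected a node is -- one simple observable of the dynamics."""
        return len(self.links[node])

net = LogicalNetwork()
net.relate("alice", "avatar-1")
net.relate("alice", "bob")
print(net.degree("alice"))  # 2
net.unrelate("alice", "bob")
print(net.degree("alice"))  # 1
```

Studying how such structures assemble, grow and dissolve is exactly where the mathematics mentioned above would operate.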

It’s about the mathematical, social, biological and psychological rules that govern how these logical networks are assembled and operated, and how they will affect our lives and the economy. That's the mine from which the value will be extracted.

20 October 2015

Structured Information beyond Shannon…

Information permeates everything: from electrochemical information exchanged in networks of neurons, to biological information stored and processed in living cells, from the information extracted from big data, to the information available on the web…to the information we process and exchange in our daily activities…etc.

On the other hand, our current understanding of information communication is still based on Claude Shannon’s seminal work of 1948, which resulted in a general mathematical theory for reliable communication in the presence of noise. Traditional information theory considers communication from the viewpoint of channels connecting two endpoints. This approach should be enhanced when considering networks (even social networks, or networks of s/w processes) with massive numbers of sources which relay information in a multi-hop manner and with time-varying logical topology. That’s an entirely different story.

Frederick P. Brooks, Jr., wrote in “The Great Challenges for Half Century Old Computer Science”: “Shannon performed an inestimable service by giving us a definition of Information and a metric for Information as communicated from place to place. We have no theory however that gives us a metric for the Information embodied in structure. . .”

Even more today, technology acceleration (e.g., in ultra-broadband diffusion, IT performance and miniaturization, systemic softwarization, etc.) is calling for enhancing this model of information, especially when considering the near advent of highly pervasive ICT fabrics with massive numbers of s/w processes which relay information up to the terminals and things. Example: it has been shown that the theoretical capacity of a multi-hop network is proportional to the square root of the network size (number of nodes). This promises enormous capacity for ultra-dense network fabrics! A breakthrough is possible here, especially when thinking about soft-RAN.
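For illustration, assuming the Gupta–Kumar style scaling behind that claim (aggregate transport capacity growing as the square root of the number of nodes, per-node throughput shrinking as its inverse), a quick numerical sketch shows what densification buys:

```python
import math

def aggregate_capacity(n, w=1.0):
    """Gupta-Kumar style scaling: total transport capacity ~ W * sqrt(n)."""
    return w * math.sqrt(n)

def per_node_throughput(n, w=1.0):
    """Each node's individual share shrinks as ~ W / sqrt(n)."""
    return aggregate_capacity(n, w) / n

# Capacity grows with network size, even as per-node share drops.
for n in (100, 10_000, 1_000_000):
    print(n, aggregate_capacity(n), per_node_throughput(n))
```

So a fabric a hundred times denser carries roughly ten times more total traffic, which is the intuition behind "enormous capacity for ultra-dense network fabrics".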

But it’s much more than that. This reasoning, abstracted, is also applicable to deriving new cognitive models, whose basis is – by definition – extracting and elaborating information, new forms of social interactions (even with/between Avatars), or new ways for humans to interact with autonomous machines or environments.

As an example of ongoing activities in this field, the National Science Foundation established some time ago the Science and Technology Center for Science of Information, to advance science and technology through a new quantitative understanding of the representation, communication and processing of information in biological, physical, social and engineering systems. The center is located at Purdue University (partners include Berkeley, MIT, Princeton, Stanford, UIUC, UCSD, Bryn Mawr and Howard U.).


My take is that Network and Service Providers too have to shift from the current “one-way value proposition”, where value lies mainly in connectivity, to a "structured information value proposition", where value lies in the information generated through multiple interactions of humans and/or machines.

18 October 2015

How many Economies ?!

Recently we are witnessing a great “creativity” in coining economy definitions and models. Examples are: Sharing economy, Circular Economy, Access Economy, Data driven Economy, Digital Economy, Information Economy, API Economy, Platforms Economy, Creative Economy… and many others.

Well, maybe we can add here another one, another buzzword: the Softwarization Economy where "software is eating the world" ! But this is not the key point.

This looks like the story of the blind men and the elephant, a tale coming from India. A group of blind men touch an elephant to learn what it is like. Each one feels a different part, but only one part. They then compare notes and argue, each convinced he is describing a different thing. But this is not the case.

That’s the same for the mentioned definitions of economies: distinctions without differences, again.

My take is that it’s a simpler story: just look at telecommunications and ICT, which are going to become accessible at low cost to Users and enterprises in any part of the world, almost on an equal basis. As a result of the consequent mass digitalization, it is becoming possible (through horizontal platforms and marketplaces) to optimize the use and sharing of any resource, and even to automate any process of the Society and the Economy. Simply put, as humans we have a new "tool" to adapt to and change the environment.

This is the elephant, at the end of the day: a hyper-connected world where telecommunications and ICT are mature enough to “morph” the space-time dimensions of reality by automating and optimising any process. All the above X-economies are expressions of this same transformation.

And out of this transformation, robots and autonomous machines are likely to replace humans in several jobs and professions. This, in fact, will bring optimizations and cost reductions: in a production environment, the highest costs come from humans.

So, it is easy to forecast this unstoppable transformation: unavoidably, any large-scale ecosystem sooner or later will move from a state with higher energy (i.e., higher cost) to a state with lower energy (i.e., lower cost); local minima usually correspond to the stable stationary states.

It’s adaptation by solving constrained optimization problems, as in Nature: the same law that drives the electron current in a circuit and steers the emergent behaviors of a beehive.
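As a purely illustrative sketch of this "settling into a lower-energy state" (a generic gradient-descent toy, not a model taken from the post), consider a system relaxing on an energy landscape with two local minima:

```python
def gradient_descent(grad, x0, lr=0.1, steps=200):
    """Drive the state downhill until it settles in a (local) minimum."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Energy landscape E(x) = (x^2 - 1)^2, with stable minima at x = -1 and x = +1.
# Its gradient is E'(x) = 4x(x^2 - 1).
grad_E = lambda x: 4 * x * (x**2 - 1)

print(gradient_descent(grad_E, x0=0.5))   # settles near +1
print(gradient_descent(grad_E, x0=-0.5))  # settles near -1
```

Which minimum the system reaches depends on where it starts: exactly the picture of ecosystems sliding into the nearest stable, lower-cost stationary state.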

The obvious question is: what’s left to humans in the future ?
A point of discontinuity is ahead of us: the Quantum leap.

17 October 2015

Big (Actionable) Data for Big Networks

Telecommunications and ICT infrastructures are becoming more and more pervasive, hyper-connected and dynamic. Internet of Things, Machine-to-Machine, advanced Robotics and A.I. applications are posing very challenging requirements.

In fact, the heterogeneity of nodes, devices and terminals is creating a complexity in management and control which is becoming more and more challenging due to scalability, dependability and real-time action-reaction requirements.

Industry, as such, is looking with interest at innovative “service oriented” approaches to management-control and orchestration, where infrastructure resources are virtualised and services become a sort of “unit of orchestration”. Think about the "X-OS".

It’s the transition from a situation where Network Providers manage and control pieces of equipment towards the ability to manage and control virtual resources or containers, and to orchestrate the various aspects of services.

When automated, this will dramatically simplify the Operational processes, reducing mistakes and delays. It's well known, in fact, that time-to-market requirements are very critical: any delay in configuring resources and services directly affects deployment and in turn has an impact on savings and revenues.

Some time ago, the IETF released NETCONF and YANG, two standards addressing configuration management: NETCONF is a configuration management protocol for transactions and dedicated configuration operations; YANG is a data modeling language used to model the configuration and state data manipulated by NETCONF. YANG can be mapped to a NETCONF XML representation easily.
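As a minimal illustration of how YANG-modeled data maps to NETCONF XML (the module structure and namespace below are made up for the example, not from any real YANG model), a simple interface container could be serialized into the payload an edit-config operation would carry:

```python
# Hypothetical YANG fragment for illustration:
#   container interface { leaf name { type string; } leaf mtu { type uint16; } }
# Such a model maps naturally to XML configuration data.
import xml.etree.ElementTree as ET

def interface_config(name, mtu):
    """Build the XML config data a NETCONF edit-config would carry."""
    root = ET.Element("interface", xmlns="urn:example:interfaces")
    ET.SubElement(root, "name").text = name
    ET.SubElement(root, "mtu").text = str(mtu)
    return ET.tostring(root, encoding="unicode")

print(interface_config("eth0", 1500))
```

The point of the post stands out here: the configuration is nothing but structured, machine-readable data.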

In other words, interestingly, YANG can be used for configuring both services and network functions: at the end of the day it’s data, actionable data. This is not new, obviously, but what makes it very attractive is the vision of the network and service platforms as a "field" of actionable data.


Future management-control and orchestration will be more and more about processing this continuous "field" of actionable data in real time: the "power of events".

10 October 2015

Growing Softwarization in Sandboxes…

Disruptive innovation is unlikely to be welcomed by everybody, since it changes the rules of the game: it requires new processes, it often displaces a number of established Players, and it lowers the threshold for Newcomers to enter, eventually changing the value chain.

So, disruptive innovation always implies a deep change of culture, requiring adaptation. This is the case for the disruptive wave of innovation being brought by Softwarization: it’s not another network layer, or an overlay, it’s a radical change in exploiting and operating Telecommunications and ICT.

In my personal opinion, disruptive innovations, or better, transformations like this one, are likely to be effectively managed only if grown in “sandboxes”. It makes little to no sense to “disrupt” a legacy infrastructure with Softwarization: it’s not effective. And it’s not about making an “overlay”, for a simple reason: the processes.

There will always be another Player moving faster to exploit such innovation, with a simpler and lower-cost infrastructure and with lighter and faster processes. The rules of the competition will be changed by Softwarization, that’s clear.

The real disruption is not about decoupling hardware from software or virtualizing functions: it’s about the processes and, as such, about the collective culture of the Industries. Faster and automated processes should be capable of handling in real time millions of software transactions, even made by non-human Users.

Obviously, “sandboxes” will have to coexist with legacy for some time, up to the point when markets let them spread like wildfire. How? See proposals in my next posts.


Join us (free registration at http://sdn.ieee.org/) to make open innovation a powerful instrument for contributing to the development of a sustainable and inclusive Digital Economy, to improve quality of life and to create new jobs for the future. 


08 October 2015

Big Data? Yes, please but “Actionable”

Data acquire value not only if they are big, but also if they are “actionable”.

Actionable means ready for being used for optimizing processes, whatever processes they are.

It’s not a matter of data quantity, but rather of quality, accuracy, persistency… and other characteristics making data “actionable”, which means making them valid and representative for a particular context or situation and, as such, adequate input for making a decision.

Actionable data are food for “cognition”, i.e., A.I., analytics methods, algorithms, heuristics… those Industrial Mathematics instruments (implemented in software) necessary for inferring decisions (using the enormous IT power now available at low cost). These decisions will then be used to actuate actions which, in turn, aim at optimizing processes, for example.

As a matter of fact, Real Time Operations of the future SDN-NFV infrastructures will strongly rely on actionable data in order to automate and optimize the Operations processes. Again, this is like closing cognition loops, implemented in software, starting from actionable data. This holds for DevOps as well.
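Such a cognition loop, closing from actionable data to decision to actuation, can be sketched very schematically (every name, threshold and scenario here is illustrative, not a real Operations system):

```python
def cognition_loop(sense, decide, actuate, iterations=3):
    """Closed loop: actionable data in, decision inferred, action back on the system."""
    for _ in range(iterations):
        data = sense()           # collect actionable data (e.g., telemetry)
        decision = decide(data)  # analytics / heuristics infer a decision
        actuate(decision)        # the action aims at optimizing the process

# Toy scenario: keep a link's utilization below a threshold by scaling capacity.
state = {"load": 0.9, "capacity": 1.0}

def sense():
    return state["load"] / state["capacity"]      # utilization

def decide(utilization):
    return "scale_up" if utilization > 0.7 else "hold"

def actuate(decision):
    if decision == "scale_up":
        state["capacity"] *= 1.5

cognition_loop(sense, decide, actuate)
print(state["capacity"])  # capacity was scaled once, then the loop holds
```

The value is not in any single datum but in the loop itself: data that feed a decision that changes the system that produces the next data.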

So I'd like to quote Marc Andreessen’s “Software is eating the World” here once more! And it's more and more evident that software is the new "tool" of the Digital Homo Sapiens...


...that's why, yesterday, when concluding my talk at the EIT Digital event “Towards a data-driven economy” (@EU Pavilion EXPO2015), I argued that with Actionable Data, Cognition will eat the World!