30 April 2016

Blockchain: a game changer also for security?


Security is one of the biggest challenges of “Softwarization”. The expected high programmability of the future Telecom infrastructure will make it possible to create and customize, dynamically, “slices” of virtual resources spanning from the Terminals through the Network up to Cloud Computing.

In order to achieve the desired levels of dynamism and openness of the future service ecosystems (running on these virtualized Telecom infrastructures) whilst ensuring security, the control, management and orchestration processes need to be highly secure and trusted, as there will be interfaces exposed to public access.

One way to achieve that is to ensure that any control, management or orchestration command or transaction is properly traced (associated with the “reason why”) and, where required, authenticated.
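This is not meant as a definitive design, but a minimal Python sketch of what tracing and authenticating an orchestration command could look like, using a standard HMAC over a message that carries the “reason why”; the key, action names and field layout are all illustrative assumptions:

```python
import hashlib
import hmac
import json
import time

# Hypothetical shared secret between the orchestrator and a slice controller.
SECRET_KEY = b"slice-controller-shared-secret"

def sign_command(action, reason):
    """Wrap a control command with its 'reason why' and an HMAC tag."""
    message = {
        "action": action,
        "reason": reason,           # traceability: why the command was issued
        "timestamp": time.time(),   # helps detect replayed commands
    }
    payload = json.dumps(message, sort_keys=True)
    tag = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "hmac": tag}

def verify_command(signed):
    """Recompute the tag and compare it in constant time."""
    expected = hmac.new(SECRET_KEY, signed["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signed["hmac"])
```

Any tampering with the payload (or its recorded reason) makes verification fail, which is the property the trusted-slice idea relies on.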

The “infrastructure slice” will have to become a sort of trusted entity, operated with automated processes. Message authentication in Network and IT is not a new story. 

But what might make a difference today is the applicability of “blockchain technology” to these future security requirements: think of the authentication of Users (humans and/or agents), but also of any commands and messages, of resources belonging to other domains, etc.


A blockchain is a distributed database that maintains a continuously growing list of data records hardened against tampering and revision. The blockchain is regarded as the main technological innovation of Bitcoin, since it stands as proof of all the transactions on the network.

A block is the ‘current’ part of a blockchain, recording some or all of the recent transactions; once completed, it goes into the blockchain as a permanent part of that database. Each time a block is completed, a new block is generated. The blockchain database is shared by all nodes participating in the system.
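The tamper-evidence property described above comes from each block embedding the hash of its predecessor. A toy Python sketch (deliberately omitting consensus, proof-of-work and networking, which real blockchains add on top) shows the core mechanism:

```python
import hashlib
import json
import time

def block_hash(block):
    """Hash a block's contents together with the previous block's hash."""
    payload = {k: block[k] for k in ("timestamp", "transactions", "prev_hash")}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def make_block(transactions, prev_hash):
    """Create a block linked to its predecessor via prev_hash."""
    block = {
        "timestamp": time.time(),
        "transactions": transactions,
        "prev_hash": prev_hash,
    }
    block["hash"] = block_hash(block)
    return block

def chain_is_valid(chain):
    """Tampering with any block invalidates its hash and every later link."""
    for prev, curr in zip(chain, chain[1:]):
        if block_hash(prev) != prev["hash"] or curr["prev_hash"] != prev["hash"]:
            return False
    return bool(chain) and block_hash(chain[-1]) == chain[-1]["hash"]
```

Changing a single transaction in an old block changes its hash, breaking the `prev_hash` link of every block after it; that is what makes “revision” detectable.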

Will blockchain be a game changer also in this direction, transforming "network slices" or even "service chains" into trusted entities?

It should be mentioned that recording a transaction in a blockchain requires time spans that might be too long for keeping track of all potential interactions, so one should consider limiting its use to those transactions that don’t have real-time constraints. Scalability is another open problem.

There is a nice white paper at this link:

...in any case it poses interesting "uberization" perspectives...



27 April 2016

Towards the Quantum Singularity

The rapid development of innovative technologies (spintronics, to mention one) and new materials (based not only on silicon but also on carbon) allows manipulating, in a controlled way, a variety of quantum entities (e.g. electrons, photons and "quasi-particles" such as phonons, anyons, etc.), thus making it feasible to process and store data at the quantum level.

Activities on quantum communications in some standardization bodies (e.g. ETSI and IEEE) and various movements of the market (also related to the creation and acquisition of start-ups) are showing the growing level of attention for this future avenue from both vendors and service providers (e.g. Microsoft, IBM, HP, Toshiba, Google, NASA, Intel, Alibaba, BT ...). As another example, in this Press Release of the European Commission https://ec.europa.eu/digital-single-market/en/news/european-cloud-initiative-give-europe-global-lead-data-driven-economy there is a plan to launch, by 2018, a flagship-type initiative to accelerate the nascent development of quantum technologies, which are the basis for the next generation of supercomputers.


Specifically, a strong interest is emerging in the use of quantum technologies for strengthening Artificial Intelligence (AI) and Cognitive Computing: Google's Quantum AI Laboratory has already demonstrated the applicability of the D-Wave 2X quantum computer to solving complex optimization problems with constraints. IBM is also considering expanding Watson's capabilities with systems based on quantum technologies.


But there are also other interesting application contexts: for example, cybersecurity (not only for transmission aspects, but also for the potential "speed" with which quantum systems might break existing cryptographic codes), and the development of new methods for processing Big Data, also suitable for applications in genetics, biology and medicine.

Among experts there is a growing belief that a "breakthrough" in the development of quantum systems (now conceivable within 10 years) would have a deep socio-economic impact, and could probably open a new CAPEX-type industrial cycle: in fact, while "softwarization" is turning telecommunications services into a "commodity", the levels of investment required to develop services for the Quantum Society would be huge.

10 April 2016

Where is the value moving?

In the coming age of "Commoditization" of hardware, when "Software will eat the world", where is the value moving?

My guess is towards "Applied Mathematics" and "Artificial Intelligence", at least in the medium term.

In fact, the value of the so-called "Platform Economy" lies neither in the hardware nor in the software per se, but in the "computational intelligence": the algorithms, the heuristics, in other words all those methods of Applied Mathematics and Artificial Intelligence capable of implementing sharing functions, optimisation capabilities, learning and, in general, all those features that make such platform services appealing for the market.

Then, in view of the "Softwarization" of Telecommunications, the next question might be: in this rush towards 5G, what techno-economic models/approaches are "good enough" to make the Telecommunications and ICT businesses sustainable?

As a matter of fact, ecosystems are changing radically, the value is moving up, and new forms of business are going to emerge, more and more driven by "Applied Mathematics" and "Artificial Intelligence".


The so-called "Over the Top" players are already there, but not on the "Top" as commonly believed! They are at the very Edge, i.e., in the Terminals and Users' Equipment (whoever the Users will be, even machines).

04 April 2016

Softwarization: a matter of mathematical thinking. And beyond?

It is often argued that “Softwarization” (and Hardware commoditization) combined with the “algorithmic revolution” will create the conditions for the development of the so-called “platform economy”.

As a matter of fact, what we are witnessing is a progressive transformation of telecommunications infrastructures into “software-based fabrics” of (almost) standard hardware, capable of enabling vertical platforms of platforms (e.g., providing services for IoT, Robotics, Industry 4.0, the Tactile Internet, etc.). With these plastic “software-based fabrics” it is expected that it will be possible to chain network functions and services, and to execute and orchestrate them on virtual resources, from the Cloud to the Fog.

On the other hand, deep inside these “software-based fabrics” there will be a “fabric of algorithms”, so the core of this transformation will be mathematics. Software is, today, simply the most used instrument to implement said mathematics.

So, mathematics will be the language, computation will be about executing said language (coded in software), storage will be about saving the related exchanged information and, eventually, networking will be about creating relationships between, and combining (at almost zero latency), said sets of functions and services.

This morning I stumbled upon this very interesting paper arxiv.org/abs/1603.06371, The Classical Origin of Modern Mathematics.

The aim of this paper is to study the historical evolution of mathematical thinking and its spatial spreading. The authors identify two important transitions in the 20th century. The first occurred between 1930 and 1940, when the disciplines of statistics and probability merged and began to attract other applied fields, such as information theory, game theory, and statistical mechanics. The result was the emergence of the field of applied mathematics. The second transition occurred between 1970 and 1980, when computer science and statistics merged to form one community (see this link).

That’s amazing. It appears that mathematical evolution is not made of smooth transitions; instead, it is a maelstrom characterized by tipping points, I’d say like “phase transitions” in complex systems.

That’s a law of Nature. My guess is that the next transition is about to occur in front of us. Current technology drivers (e.g., what’s behind Softwarization) are steering computer science, Artificial Intelligence and the Applied Mathematics of networks to merge into a new community. This will have far-reaching socio-economic implications for the Digital Society and Economy!

Beyond that, the next frontier will be realizing that the human mind is not at all “algorithmic” (as, by the way, remarkably argued by K. Gödel), and that will be a much more impactful revolution, based on a new theory of information!