30 June 2015

Softwarization Anthropology (cont’d)

Who can forget the amazing movie “2001: A Space Odyssey”? The movie starts with early hominids discovering how to use a bone as a tool. Millions of years later, space technology and artificial intelligence (HAL) have become the new tools, bringing humans on a challenging voyage to Jupiter.



The way humans make and use tools is what determines our species’ evolution, perhaps making it unique.

Well, softwarization can be seen as the new tool for humanity, offering new ways of communicating and of producing and consuming ICT services, creating the Digital Society and Digital Economy. In fact, it will be a tool capable of distorting, even shrinking, the space-time dimensions of the Digital Society in this data-centric era.

Telecommunications and ICT are making distances shorter and shorter, and space is being morphed by the digitalization of objects, things and processes. Cause-effect latencies are being dramatically reduced, and “control loops” intertwined as in Complex Adaptive Systems. Nonlinearity is going to characterize the socio-economic variables of our world.

So softwarization will bring big changes, as it will allow us to implement our cognitive models in reality.

In fact, we’ve mentioned several times that softwarization will allow pervasively sensing and collecting massive data (by sensors, terminals, smart things, machines, robots, drones, etc.); exchanging data (via fixed and mobile networks with high bandwidth and low latency); quickly analyzing big data (with Cloud-Edge Computing) in order to elaborate decisions (e.g., with AI methods, algorithms, heuristics) for actuating local actions (by any actuators)… That's how a "nervous system" works. And this will boost pervasive robotics and intelligent-machine applications. The “Second Machine Age” is really coming.
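The sense → exchange → analyze → decide → actuate loop above can be sketched in a few lines of code. This is a toy illustration, not any real platform: all function names, sensor values and the decision threshold are invented for the example.

```python
# A minimal sketch of the "nervous system" control loop: sense, analyze,
# decide, actuate. Every name and threshold here is an illustrative assumption.

from dataclasses import dataclass
from typing import List


@dataclass
class Reading:
    sensor_id: str
    value: float


def sense(sensors: dict) -> List[Reading]:
    # Pervasive sensing: collect raw data from each (simulated) sensor.
    return [Reading(sid, val) for sid, val in sensors.items()]


def analyze(readings: List[Reading]) -> float:
    # Cloud/edge analytics: here, simply the average of the readings.
    return sum(r.value for r in readings) / len(readings)


def decide(aggregate: float, threshold: float = 30.0) -> str:
    # Decision logic: a trivial rule standing in for AI methods,
    # algorithms, or heuristics.
    return "cool_down" if aggregate > threshold else "idle"


def actuate(action: str) -> str:
    # Local actuation closing the control loop.
    return f"actuator executed: {action}"


if __name__ == "__main__":
    readings = sense({"s1": 28.5, "s2": 33.0, "s3": 31.5})
    print(actuate(decide(analyze(readings))))
```

In a real deployment each stage would of course be distributed: sensing at the edge, analytics split between edge and cloud, actuation back at the edge; the point is only the shape of the loop.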

I believe that, from an anthropological perspective, this also resonates strongly with what R. Bandler and J. Grinder called “distortion”, one of the processes which allow us to survive, grow and change… a process which allows us to make shifts in our experience of sensory data. Indeed, future Telecommunications and ICT services will allow us to transform our experiences, digitalizing everything around us.

Marshall McLuhan, too, argues in “Understanding Media” that the web, media and Telecommunications in general are bringing about a sort of social "implosion", as people and processes become more closely connected and interworking. This unification through implosion, for McLuhan, allows the idea of living in a "global village" to emerge. And emergence is a phenomenon of Complex Adaptive Systems.

But I think that this "global village" will also be populated with digital beings such as virtual and embodied agents or software processes, becoming characters of our socio-economic surroundings and life conditions. ICT is reaching such a level of maturity that it will fade into physical reality, merging with it. So it won’t be just a matter of technology!


It will also be very important to work out how to make this "complex" merger (of the natural and softwarized worlds) really acceptable from a socio-economic perspective, by addressing issues such as overall sustainability and compatibility, with respect for Nature.

26 June 2015

Softwarization Anthropology

This post is not about technology: it’s about psychology, anthropology and sociology.

Why? Well, digital anthropology is already here: it is the study of the relationship between humans and digital-era technology. The field goes by a variety of names with a variety of variations, including techno-anthropology, digital ethnography, cyber-anthropology, virtual anthropology, etc.

Needless to say, the “Softwarization of Telecommunications” will also intersect with various disciplines dealing with humans, including anthropology, psychology, sociology, etc.

Some time ago I read the amazing book “The Structure of Magic” (A Book about Language and Therapy) by the two psychologists R. Bandler and J. Grinder.

The book contains very interesting remarks about the human condition and its paradoxes. Quoting the Authors: the most pervasive paradox of the human condition which we see is that the processes which allow us to survive, grow, change, and experience joy are the same processes which allow us to maintain an impoverished model of the world - our ability to manipulate symbols, that is, to create models. We can identify three general mechanisms by which we do this: Generalization, Deletion, and Distortion.

Generalization is the process by which elements or pieces of a person's model become detached from their original experience and come to represent the entire category of which the experience is an example. Deletion is a process by which we selectively pay attention to certain dimensions of our experience and exclude others. Distortion is the process which allows us to make shifts in our experience of sensory data. […] It is this process which has made possible all the artistic creations which we as humans have produced. A sky as represented in a painting by Van Gogh is possible only as Van Gogh was able to distort his perception of the time-place in which he was located at the moment of creation. […] Similarly, all the great novels, all the revolutionary discoveries of the sciences involve the ability to distort and misrepresent present reality.


Softwarization is like a tool that will offer humans a potential new world of communications and services: I believe that the creation of new socio-economic development will depend very much on our ability to manipulate symbols, that is, to create models.

This is a call for psychologists, sociologists and anthropologists to join our initiative!

15 June 2015

Two Innovation Cycles for Softwarization

SDN and NFV principles are going to impact not only the evolution of current networks, but also service and application platforms. It would be very limiting to consider SDN and NFV only from the networking perspective.

In fact, Softwarization will be a systemic and impactful paradigm change across the whole Telecommunications and ICT domains, with far-reaching consequences in the value chain: it won’t be just like introducing a new transport or networking technology or a new network layer (as it was for SDH, IP/MPLS, etc.).

We know very well that, in the past, Telecommunications infrastructures were always built with purpose-built equipment designed for specific functions; these pieces of equipment were provided by Technology Providers as “closed boxes”, including the hardware, the software and its operating system.
When introducing a new technology or network layer, a waterfall innovation approach was normally adopted. Softwarization is a game changer. Waterfall innovation is dead.

In the future, the decoupling of software from hardware, the virtualization of IT and network physical resources, and the growing availability of Open Source software will change the scenario: it will be possible to develop and manage network and service functions as “applications”, made of chains of open source software components interconnected via logical links.
The software mindset is eating Telecommunications.

In this respect, it is likely that Softwarization will be exploited by Industry with a bimodal approach, through two innovation cycles: one relatively slow, looking at a seamless evolution of current network infrastructures towards SDN-NFV, and another much faster one, where Softwarization will pave the way to integrated network and service software platforms. A totally different scenario from past times.

These two cycles will coexist, leading to the deployment of “softwarized” network and service domains, operated with IT-style processes and capable of providing specific end-to-end services.

These “softwarized” network and service domains will emerge here and there and coexist for some time with legacy infrastructures. A key issue will be the cross-interoperability of these “softwarized” domains, which should be pursued from the beginning, by design, i.e., by embedding interoperability features and capabilities into these horizontal platforms.

03 June 2015

An Open Fabric at the Edge

Fog Computing pushes the Cloud Computing paradigm towards the edge of current networks, leveraging distributed processing and storage resources around Users. Apparently yet another buzzword; in reality, the progressive maturation of a technology trend of hardware miniaturization, increasing performance and cost reductions, together with pervasive ultra-broadband. This results in more and more powerful devices, smart terminals and intelligent machines scattered in the environment around Users, capable of storing data and executing services locally (or, better, in orchestration with the Cloud).

This floating fog of ICT resources at the edge will create the conditions whereby Users will literally “decide and drive” future networks and services. This fog of edge devices can indeed create a sort of processing and storage fabric that can be used to execute any network function and to provide any sort of ICT services and applications. The components of this fabric can be seen as: CPU/GPU, SSD (Solid State Drive), HDD (Hard Disk Drive) and link (perfectly in line with the “disaggregation of resources” targeted by the Open Compute Project). One may imagine these components aggregating dynamically in an application-driven “flocking”. And, in the same way as birds with simple local behaviors optimize the aerodynamics of the flock (solving a “constraint optimization problem” by using very simple local rules), the flocking of components can dynamically follow application-driven network optimizations.
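To make the flocking analogy concrete, here is a toy sketch of components aggregating by a purely local rule: each idle component joins the application with the largest unmet demand for its own resource type, with no global optimizer in the loop. The component inventory, application names and demand figures are all invented for the illustration.

```python
# Toy "flocking" of disaggregated edge resources (CPU, SSD, HDD, link):
# each component applies one simple local rule, yet the pool self-organizes
# around the applications' demands. All numbers are illustrative assumptions.

from collections import defaultdict

COMPONENTS = [
    ("cpu", 4), ("cpu", 2), ("ssd", 100), ("hdd", 500),
    ("link", 10), ("cpu", 8), ("ssd", 200), ("link", 40),
]

# Each application declares how much of each resource type it still needs.
DEMANDS = {
    "video-analytics": {"cpu": 10, "ssd": 250, "link": 30},
    "static-cache":    {"hdd": 400, "link": 20},
}


def flock(components, demands):
    """Greedy local rule: a component joins the application with the largest
    unmet demand for its own type; no central solver is involved."""
    remaining = {app: dict(need) for app, need in demands.items()}
    assignment = defaultdict(list)
    for ctype, capacity in components:
        # Local decision: look only at the unmet demand for my own type.
        gap, app = max((need.get(ctype, 0), app)
                       for app, need in remaining.items())
        if gap > 0:
            assignment[app].append((ctype, capacity))
            remaining[app][ctype] = max(0, gap - capacity)
    return dict(assignment), remaining
```

Running `flock(COMPONENTS, DEMANDS)` assigns the HDD to the static cache and the CPUs, SSDs and links mostly to the analytics application; real bird-like flocking would add continuous re-evaluation as components appear, disappear and move, but the spirit of "simple local rules, global order" is the same.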

So, imagine providing ICT services by orchestrating the use of local idle computing and storage resources of millions of smart terminals, nodes and machines at the edge… One may argue that not all types of services and applications can run entirely at the edge; however, there are several examples, such as content aggregation and transformation, data collection, analytics, static databases, etc., which can really benefit from the fog paradigm.


Surely, end-to-end latency is one of the major problems to be solved.

Imagine, just for didactic purposes, an equivalence between the time of one CPU cycle and the time of a step in a walk. The latency in accessing main memory (e.g., DRAM) is around tens of CPU cycles, i.e. tens of steps in our example. But if you estimate the average latency in accessing HDDs, i.e. stored data (also including the latency of the network links, RTT), then, overall, it amounts to a walk of about 10 000 km!
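The arithmetic behind the walking analogy is easy to check. The figures below (a 3 GHz clock, ~60 ns DRAM access, ~5 ms for an HDD access including some network round-trip, a 0.7 m stride) are rough assumed values, but any numbers in the right ballpark give the same order of magnitude.

```python
# Back-of-the-envelope check of the walking analogy: one CPU cycle = one step.
# Clock speed, latencies and stride length are rough assumed figures.

CPU_GHZ = 3.0                      # assumed clock: one cycle ~ 0.33 ns
CYCLE_S = 1 / (CPU_GHZ * 1e9)
STEP_M = 0.7                       # assumed length of one walking step


def latency_in_steps(latency_s: float) -> float:
    """Express a latency as walking steps, one step per CPU cycle."""
    return latency_s / CYCLE_S


dram_s = 60e-9                     # typical DRAM access: tens of nanoseconds
hdd_s = 5e-3                       # HDD seek + rotation, plus some network RTT

dram_steps = latency_in_steps(dram_s)                  # a couple hundred steps
hdd_km = latency_in_steps(hdd_s) * STEP_M / 1000       # tens of millions of steps

print(f"DRAM access ~ {dram_steps:.0f} steps")
print(f"HDD access  ~ {hdd_km:,.0f} km walked")
```

With these assumptions a DRAM access is a stroll of under 200 steps, while an HDD access works out to roughly 10 000 km of walking, which is the gap the fog/edge paradigm has to reckon with.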