IPv6 – your next cash cow?

For anybody looking for the next big thing, the new 'killer app' or the new gold, I recommend reading a white paper by the Atos Scientific Community called "IPv6: How Soon is Now?".

The paper explains very well the problem with the way the internet currently works. It points out that we have a serious issue, a 'time-bomb', in the way that devices (computers, networking components and other IT equipment) are connected with each other using the old IPv4 technology. The paper further explains why, in spite of all kinds of intermediate technologies, we need to adopt a new technology, called IPv6, and why we need to do so very quickly.

"To sustain the expected Internet growth, there is no adequate alternative to adopting IPv6."
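To get a feel for the scale problem behind that statement, here is a quick sketch of my own (not from the paper), using Python's standard ipaddress module to compare the two address spaces:

```python
import ipaddress

# IPv4 offers 2**32 addresses; IPv6 offers 2**128.
ipv4_space = ipaddress.ip_network("0.0.0.0/0").num_addresses
ipv6_space = ipaddress.ip_network("::/0").num_addresses
print(ipv4_space)   # 4294967296
print(ipv6_space)   # 340282366920938463463374607431768211456

# A single standard /64 IPv6 subnet already dwarfs the entire IPv4 internet.
subnet = ipaddress.ip_network("2001:db8::/64")
print(subnet.num_addresses > ipv4_space)  # True
```

Roughly 4.3 billion IPv4 addresses for a world of billions of people and many more devices: that is the time-bomb the paper describes.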

Furthermore, you will read in the paper that we will run into real problems if we do not make that change, and unfortunately the change is happening much too slowly.

"Unfortunately statistics from Google paint a (…) picture with less than 1% of total users being IPv6 clients"

This might sound awfully boring, a playing field for the technology wizards in your organization – not for you, right? But wait, because halfway through the paper the authors start explaining that the benefit of this new technology lies in the way all kinds of devices (including cars, phones, traffic lights, wind mills, televisions, your shoes and wrist watch, medical devices and almost anything else) can become connected – can talk with each other – once we switch to IPv6.

"(…) that IPv6 can now be used on virtually any communicating object, from servers to small sensors, regardless of the underlying (…) network technology."

I think this changes everything; it opens up a whole new world of play for consumers and manufacturers, for service providers and retailers; to create new businesses, to open up new markets and create new ways of making money.

"The IPv6 "Killer App" is likely to be the enablement of the Internet of Things (IoT)"

Based on this, you would be stupid not to support the move to IPv6; it will be the engine that allows your business to innovate and grow; your IT landscape will increase a thousandfold and you can bring any type of information, sensor or other device into your business platform. That is cool and exciting.

But it will not be easy.

"Although many people think that a migration to IPv6 is primarily a networking issue, the truth is that all IT organizations across server, network, storage and application domains must be equally trained to contribute to both the planning and execution."

The authors explain in quite some detail that you will need to overcome technical hurdles (IP Space Management, IP Address Provisioning, IPv6 to IPv4 interoperability, Application IPv6 readiness and Security Challenges) as well as business challenges (Coordination across silos and companies, Timing issues on what to do first and governance to establish End-to-end responsibility).
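One of those interoperability hurdles can be made concrete with a small sketch (my illustration, not from the paper): IPv4-mapped IPv6 addresses let IPv6-only software keep referring to IPv4 hosts during the transition.

```python
import ipaddress

# An IPv4-mapped IPv6 address embeds an IPv4 address in the ::ffff:0:0/96
# range, so dual-stack software can represent IPv4 peers in IPv6 form.
mapped = ipaddress.IPv6Address("::ffff:192.0.2.1")
print(mapped.ipv4_mapped)   # 192.0.2.1
print(mapped.exploded)      # 0000:0000:0000:0000:0000:ffff:c000:0201

# A plain IPv6 address has no IPv4 mapping.
plain = ipaddress.IPv6Address("2001:db8::1")
print(plain.ipv4_mapped)    # None
```

Mechanisms like this are exactly why the migration touches applications and not just the network layer: software has to recognize and handle both address families correctly.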

"We predict a tipping point when there will be more IPv6-connected users and devices, and therefore opportunity, than the IPv4 landscape provides today."

So, want to grow your business, do the strategically right thing and set yourself up for business growth, agility and all the other stuff you need and like? Migrate to IPv6 now.


This blog post was previously published at http://blog.atos.net/blog/2013/09/03/watch-this-space-lucky-7-avoiding-information-overload/

The Data ‘Explosion’ is real

In recently published research called Ascent Journey 2016, the Atos Scientific Community identifies the massive growth in data and storage as an important trend in IT.

Whilst the concept of Big Data has been around for a number of years and is relatively well understood, it is now becoming clear that everything we do is leaving a trail of data that can be analyzed and used.

Examples include the payments we make on a credit card, the books we read on an e-reader and the energy we use when driving an electric car. This will lead to a new era of Total Data that, in turn, will lead to new business models, services and economic growth.

We don't yet understand all the implications of this – for businesses and society – but organizations that are able to harness and make sense of the vast quantities of heterogeneous data from disparate sources will gain valuable insights into market trends and opportunities.

An 'Ecosystem' of new management tools is taking shape, covering the various layers of the data stack in the enterprise and capable of delivering a 'Total Data' approach.

The technology that supports the Information Management Lifecycle in the enterprise is going through a profound change, due to the emergence of new solutions, many from an open-source background (NoSQL databases, Hadoop, analytical tools like R, visualization tools). To enable the 'Total Data' environment, the new technologies need to connect into, and partly replace, traditional technologies.

  • In some scenarios, data must be obtained, processed and correlated, with insights derived and actions initiated as close to real time as possible.

Yesterday's data is not interesting unless it helps predict tomorrow. Yesterday's traffic report isn't helpful in plotting a journey today unless it is known to represent today's pattern as well and, combined with other data, can help reduce future congestion. Pattern Based Strategy enables huge amounts of historical data to be analyzed for previously invisible patterns. These patterns give us the power to start predicting what is likely to happen in the future, so we can plan and improve, both in real time and in non-real-time scenario planning. For example, real-time predictive analysis could plot the route for transporting donor organs across a city safely and quickly, continuously adapting the route to changes in traffic patterns as they happen. Another example is a country using analytics on historical data to make compliance recommendations (potentially even legal requirements) to companies for the maintenance regimes of their infrastructures or industrial plants, thus establishing an automated "what are the lessons learned" process.
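As a toy illustration of that idea (with entirely hypothetical data, not from the research), the simplest form of pattern-based prediction is to forecast a value for a time slot from its own historical pattern:

```python
# Hypothetical past travel times in minutes, grouped by recurring time slot.
history = {
    "mon 08:00": [42, 45, 44, 47],
    "mon 12:00": [25, 24, 26, 25],
}

def predict(slot):
    """Predict the next value for a time slot as the mean of its history."""
    past = history[slot]
    return sum(past) / len(past)

print(predict("mon 08:00"))  # 44.5 -> rush hour, plan extra time
print(predict("mon 12:00"))  # 25.0 -> midday is quicker
```

Real pattern-based systems are of course far richer (many correlated data sources, continuously updated models), but the principle is the same: historical patterns become forward-looking predictions.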

  • Everything will be digital and everything will be connected.

Everything will be captured; "your life is becoming a video" – you can even replay your actions and thoughts and analyse them in various forms and for multiple purposes (see http://quantifiedself.com/ for example). This is not only becoming possible for people's lives; anything that can be measured can be tracked, traced and put in a digital context for analysis. The ability of businesses to process this wealth of information is still unclear. What these developments – and related ones such as 3-D printers and cognitive computers that will be able to replicate smell and touch for their users – mean for society will prove a huge challenge to governments, businesses and individuals in the 21st century; long-established laws and concepts such as individual privacy will need to be reassessed.

  • After an initial confusion phase, traditional and 'Big Data' orientated approaches to analytics will converge in a unified 'Total Data' platform.

Big Data relies on its sister technologies of optimized IT networks, rapid mobile communications and cloud computing. Data Analytics as a Service could emerge from a combination of Big Data, Pattern Based Strategy and Cloud technologies. Business performance can improve in areas such as forecasting and automation, and businesses can build new propositions upon the discoveries they make using Total Data as a source of previously undiscovered information.

[This blog post is a rewrite of http://blog.atos.net/blog/2013/02/22/the-data-explosion-is-real/?preview=true&preview_id=1555&preview_nonce=6df2f23c80 ]

How big is your robot?

What do you get when you combine cloud computing, social networking, big data and modern day engineering? You get a kick-ass robot. This was my first thought when I finished reading a whitepaper published by the Atos Scientific Community on the topic of robots.

Central to the paper is the question: “Where is the mind of the future robot?” By outlining the concept of a robot that can utilize everything available in cyberspace, the authors make that question surprisingly difficult to answer.

Today it is hard to tell where on earth the data about you is stored in the cloud, and we have never been able to communicate more easily. It is easy to see that robots will be everywhere, able to utilize all available information. This will lead to a new class of robot personas and capabilities.

Once the robot is part of a social network, it could virtually interact with humans as well and thus start truly mimicking human behavior.


When I was (much) younger we had a program on our home computer called ‘Eliza’. This program behaved as an electronic psychiatrist. It had some limited learning capabilities and some clever language skills to ‘trick’ you into having an actual conversation.

If you typed something like “I hate talking to a computer”, Eliza would answer with “Hate seems to be important to you, can you explain that?”
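The trick behind a program like Eliza is simple pattern matching with canned, reflective responses. A minimal sketch in Python (illustrative rules of my own, not the original Eliza script):

```python
import re

# Each rule pairs a pattern with a response template; captured text from
# the user's sentence can be reflected back in the reply.
RULES = [
    (r"i hate (.*)", "Hate seems to be important to you, can you explain that?"),
    (r"i am (.*)", "Why do you say you are {0}?"),
    (r"i feel (.*)", "Do you often feel {0}?"),
]

def respond(text):
    """Return an Eliza-style reply for the first matching rule."""
    for pattern, template in RULES:
        match = re.match(pattern, text.lower())
        if match:
            return template.format(*match.groups())
    return "Please tell me more."

print(respond("I hate talking to a computer"))
# -> Hate seems to be important to you, can you explain that?
print(respond("I am tired of typing"))
# -> Why do you say you are tired of typing?
```

A handful of rules like these is enough to create the illusion of a conversation, which is exactly why the original program was so convincing for its time.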

If we now multiply the capabilities of this ‘Eliza’ by a thousand or more (using cloud computing scalability), bring in the analytics of all your ‘likes’ or ‘diggs’ and even the behaviour of your friends, combine that with knowledge of your locations, and multiply it all by analysing the things you did five years ago, ten years ago and today … well, I think you get the picture.

The more a future robot knows or has access to, the more it will be able to fulfil its role in supporting us. This may not sit well with everybody, but if we utilize this capability in a clever way, I believe we can benefit.

Especially if we also take into account that a robot can take different forms, could exist virtually, or might even be in multiple locations at the same time, with access to the right information and the computing power to use it to our benefit. The whitepaper describes some of these scenarios and puts them in the perspective of the role of IT providers and systems integrators.

Based on my reading of the whitepaper I was thinking that maybe the statement ‘I cannot be in two places at the same time’ will soon become a thing of the past.



[This blog post is a repost of http://blog.atos.net/blog/2012/11/26/watch-this-space-how-big-is-your-robot/ ]


 

Ascent Journey 2016 – the future trends in IT and business explained



Atos just announced the publication of Ascent Journey 2016 – Enterprise Without Boundaries.

"Ascent Journey 2016 is a unique and comprehensive document where Atos’ Scientific Community presents its predictions and vision for the technology that will shape business through to 2016.

It builds on Journey 2014 – Simplicity with Control and is enriched by the new challenges which have now emerged in reshaping both society and business alike.

Our research suggests that the convergence of key issues affecting Demographics, Globalization and Economic Sustainability, underpinned by Trust, will see a new way of working emerge in which traditional barriers no longer exist, but where security and privacy are more important than ever."

Exciting stuff, and I am honoured to say I was part of the editorial board that produced this document.

More information and download here.

Press release here.

 

 

The PaaS cloud computing lock-in and how to avoid it

Cloud computing has changed from an easy choice into a difficult decision.

The reason is the proliferation of cloud offerings at all layers; today we not only find ‘everything-as-a-service’ cloud solutions, but also ‘everything-is-tailored-for-your-specific-situation-as-a-service’ offerings tagged as cloud solutions.

Is this good? I do not think so.

My main objection is that you will end up with a cloud solution that is no different from any solution you have previously designed and installed yourself, just at a cheaper rate and with a lower-quality SLA.

True cloud solutions should not only focus on cost reduction, increased agility and flexible capabilities. You should also be buying something that supports portability between the private and public computing domain, and across different vendor platforms.

In early cloud solutions, mainly the ones focussing on Infrastructure-as-a-service, this portability has been heavily debated (remember the ‘Open Cloud Manifesto’?) and in the end we concluded that server virtualization solved a lot of the portability issues (I am simplifying of course).

We also had Software-as-a-Service, and some publications showed that portability could be addressed by looking at standardized business process definitions and data normalisation (again, I am simplifying).

Now the Atos Scientific Community has published a whitepaper that looks at the most complex form of cloud computing: Platform-as-a-Service.

“PaaS offerings today are diverse, but they share a vendor lock-in characteristic. As in any market for an emerging technology, there is a truly diverse array of capabilities being offered by PaaS providers, from supported programming tools (languages, frameworks, runtime environments, and databases) to various types of underlying infrastructure, even within the capabilities available for each PaaS.”


So a common characteristic that can be extracted from all this diversity is that PaaS users are currently bound to the specific platform they use, making it difficult to port the software (and data) created on top of these platforms.

As a result we see slow adoption of PaaS in the enterprise; only those groups that have a very well-defined end-user group are looking at PaaS – and mostly for the wrong reason: ‘just’ cost saving through standardization.

In the Atos Scientific Community whitepaper they are identified as:

“Two primary user groups which benefit from using Cloud at the Platform as a Service level: Enterprises with their own internal software development activities and ISVs interested in selling SaaS services on top of a hosted PaaS.”


The current situation, where PaaS mostly results in vendor lock-in scenarios, is holding back the full potential of applications on a PaaS.

A general-purpose PaaS would enable a comprehensive, open, flexible and interoperable solution that simplifies the process of developing, deploying, integrating and managing applications in both public and private clouds.

Such an architecture is proposed and explained in detail in the whitepaper; it describes the desired capabilities and building blocks that need to be established and it also offers an analysis of market trends and existing solutions, in order to establish a future vision and direction for PaaS, as well as outlining the business potential of such a solution.
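To sketch the underlying idea in code (all names below are hypothetical illustrations of mine, not from the whitepaper): applications target a provider-neutral interface, and provider-specific adapters are plugged in underneath. Swapping the adapter, rather than rewriting the application, is what makes moving between clouds feasible.

```python
from abc import ABC, abstractmethod

class BlobStore(ABC):
    """Provider-neutral storage abstraction an application codes against."""
    @abstractmethod
    def put(self, key, data): ...
    @abstractmethod
    def get(self, key): ...

class InMemoryStore(BlobStore):
    """Stand-in adapter; real adapters would wrap a specific vendor's API."""
    def __init__(self):
        self._data = {}
    def put(self, key, data):
        self._data[key] = data
    def get(self, key):
        return self._data[key]

def run_app(store: BlobStore):
    """The application only sees BlobStore, never a vendor API."""
    store.put("greeting", b"hello")
    return store.get("greeting")

print(run_app(InMemoryStore()))  # b'hello'
```

The hard part, which the whitepaper addresses at the architectural level, is agreeing on such neutral interfaces across the far richer set of platform capabilities (runtimes, databases, messaging, management) that real PaaS offerings expose.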

We can all continue to feel positive about the power and the business potential of cloud computing.

Changing your cost base from capex to opex, increasing your speed in your go-to-market strategies and the flexibility in capacity and location are very important for your business.

We should not, however, confuse vendor-specific solutions with cloud solutions only because they promise flexibility in cost and easy deployment; being able to shift and shop around is always better – also in cloud computing.


This blog post is a repost of http://blog.atos.net/sc/2012/10/15/watch-this-space-the-paas-cloud-computing-lock-in-and-how-to-avoid-it/