Curiosity drives cloud computing

I like asking questions and I like getting good answers even better. Because of that, I now have a love/hate relationship with search engines. Most of the time they give me a 50% answer: a direction, a suggestion, a kind of coaching towards the real answer. It is like the joke about the consultant: “the right answer must be in there somewhere, because he or she gives me so many responses”.

In spite of all kinds of promises, search engines have not really increased their intelligence. Complex questions with multiple variables are still nearly impossible to get answered, and the suggestions to improve my question are mostly about my spelling or because the search engine would have preferred to be asked about a different subject.

So nothing really good is coming from search engines then? Well, arguably search engines have brought us cloud computing and very powerful access to lots and lots and lots of data, otherwise known as ‘the world wide web’.

No wonder, then, that I see powerful access and cloud computing as the two most important values we want to keep while increasing the capacity and intelligence to do real analytics on large data sets.

In a whitepaper of the Atos Scientific Community, these two elements are explored in great depth:

  • Data Analytics needs cloud computing to create an “Analytics as a Service” model, because that model best addresses how people and organizations want to use analytics.
  • This Data Analytics as a Service (DAaaS) model should not behave as an application; it should be available as a platform for application development.

The first statement, on the cloud computing needs, suggests we can expect analytics to become easily deployed, widely accessible and not dependent on deep investments by single organizations; ‘as a service’ implies relatively low cost and certainly a flexible usage model.

The second statement, about the platform capability of data analytics, however, has far-reaching consequences for the way we implement and build the analytic capabilities for large data collections.

Architecturally, and due to the intrinsic complexities of analytical processes, the implementation of DAaaS presents an important set of challenges, as it is more similar to a flexible Platform as a Service (PaaS) solution than to a more “fixed” Software as a Service (SaaS) application.

It is relatively easy to implement a single application that will give you an answer to a complex question; many applications for mobile devices are built on this model (take, for example, the many applications for public transport departure times, arrival times and connections).

This “1-application-1-question” approach is, in my opinion, not a sustainable model for business environments; we need some kind of workbench and toolkit that is based on a stable and well-defined service.
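To make that contrast concrete, here is a minimal sketch of what analytics as a reusable service could look like from a developer's point of view: one stable service answers many different questions, instead of one dedicated application per question. The AnalyticsService class, its endpoint and its method names are hypothetical illustrations, not the interface described in the whitepaper.

```python
# Hypothetical sketch: one reusable analytics service answering many questions,
# instead of one dedicated application per question.
import json
import urllib.request


class AnalyticsService:
    """Thin client for an imaginary 'Data Analytics as a Service' endpoint."""

    def __init__(self, base_url: str, api_key: str):
        self.base_url = base_url
        self.api_key = api_key

    def ask(self, dataset: str, question: dict) -> dict:
        """Submit a parameterised analytics question against a named data set."""
        payload = json.dumps({"dataset": dataset, "question": question}).encode()
        request = urllib.request.Request(
            f"{self.base_url}/analytics/query",
            data=payload,
            headers={"Authorization": f"Bearer {self.api_key}",
                     "Content-Type": "application/json"},
        )
        with urllib.request.urlopen(request) as response:
            return json.load(response)


# The same service instance serves very different questions;
# no new application has to be built for each one.
service = AnalyticsService("https://daaas.example.com", api_key="...")
departures = service.ask("public-transport",
                         {"type": "next-departures", "station": "Utrecht"})
churn = service.ask("customer-data",
                    {"type": "churn-risk", "segment": "enterprise"})
```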

The white paper describes a proof of concept that has explored such an environment for re-usability, cloud aspects and flexibility. It also points to the technologies used and how they can work together to create ‘Data Analytics as a Service’.


This blog post was previously published at http://blog.atos.net/blog/2013/03/25/watch-this-space-curiosity-drives-cloud-computing/



Would you like a cup of IT?

The change in the IT landscape brought about by the introduction of Cloud Computing is now driving a next generation of IT enablement. You might call it Cloud 2.0, but the term 'Liquid IT' covers much better what is being developed.

In a recently published white paper by the Atos Scientific Community, Liquid IT is positioned not only as a technology or architecture; it is also very much focused on the effects of this change on the business you are doing day to day with your customer(s).

"A journey towards Liquid IT is actually rather subtle, and it is much more than a technology journey"

The paper explains in detail how the introduction of more flexible IT provisioning, now done in real time, allows for financial transparency and agility. A zero-latency provisioning and decommissioning model, complete with genuine utility pricing based on actual resources consumed, enables us to drive the optimal blend of minimizing cost and maximizing agility. Right-sizing capabilities and capacity at all times to the needs of the users will impact your customer relationship – but, very importantly, designing such a system starts with understanding the business needs.
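As a back-of-the-envelope illustration of such utility pricing, the sketch below charges purely for the resources that were actually consumed and for how long. The resource names and hourly rates are invented for the example and are not taken from the paper.

```python
# Hypothetical utility-pricing sketch: charge only for what was actually consumed.
from dataclasses import dataclass


@dataclass
class UsageRecord:
    resource: str      # e.g. "vcpu", "ram_gb", "storage_gb"
    hours: float       # how long the resource was provisioned
    quantity: float    # how much of it was provisioned during that time


# Illustrative hourly rates per unit of each resource (invented numbers).
HOURLY_RATES = {"vcpu": 0.04, "ram_gb": 0.01, "storage_gb": 0.0002}


def period_charge(usage: list[UsageRecord]) -> float:
    """Sum cost over all usage records: quantity * hours * rate per unit-hour."""
    return sum(r.quantity * r.hours * HOURLY_RATES[r.resource] for r in usage)


# A workload that is decommissioned after ten days only pays for those ten days.
usage = [
    UsageRecord("vcpu", hours=240, quantity=8),
    UsageRecord("ram_gb", hours=240, quantity=32),
    UsageRecord("storage_gb", hours=240, quantity=500),
]
print(f"Charge for the period: {period_charge(usage):.2f} EUR")
```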

"Liquid IT starts from the business needs: speed, savings, flexibility, and ease of use"

Existing examples of extreme flexibility in IT (think Gmail, Hotmail or other consumer-oriented cloud offerings) have had to balance standardization and scale: the more standard the offering, the greater the gains in scale that can be achieved. This has always been a difficult scenario for more business-oriented applications. The paper postulates that, with proper care for business needs and the right architecture, similar flexibility is achievable for business processes.

Such a journey to 'Liquid IT' indeed involves tough choices in technology and organization, but it also forces the providers of such an environment to take an in-depth look at the financial drivers in the IT provisioning and IT consumption landscape.

"The objectives of financial transparency dictate that all IT services are associated with agreed processes for allocation, charging and invoicing"

There are two other aspects that need to change in parallel with this move to more agility in IT: the role of the CIO will evolve, and the SLAs that he or she is either buying or selling will change accordingly.

Change management will transform into Information Management as the use of IT as a business enabler is no longer the concern of the CIO. IT benchmarking will become an increasingly important tool to measure the level of agility achieved for the business owners. The contribution to business performance will be measured and needs to be managed in line with business forecasts.

The white paper authors conclude that "Business agility is the main result of Liquid IT" – sounds like a plan!

This blog post was previously published at http://blog.atos.net/blog/2013/03/08/watch-this-space-would-you-like-a-cup-of-it/


 

Ascent Journey 2016 – the future trends in IT and business explained



Atos just announced the publication of Ascent Journey 2016 – Enterprise Without Boundaries.

"Ascent Journey 2016 is a unique and comprehensive document where Atos’ Scientific Community presents its predictions and vision for the technology that will shape business through to 2016.

It builds on Journey 2014 – Simplicity with Control and is enriched by the new challenges which have now emerged in reshaping both society and business alike.

Our research suggests that the convergence of key issues affecting Demographics, Globalization and Economic Sustainability, underpinned by Trust, will see a new way of working emerge in which traditional barriers no longer exist, but where security and privacy are more important than ever."

Exciting stuff, and I am honoured to say I was part of the editorial board that produced this document.

More information and download here.

Press release here.

 

 

The PaaS cloud computing lock-in and how to avoid it

Cloud Computing has changed from choosing an easy solution into making a difficult decision.

The reason is the proliferation of cloud offerings at all layers; today we do not only find ‘everything-as-a-service’ cloud solutions, but also ‘everything-is-tailored-for-your-specific-situation-as-a-service’ tagged as cloud solutions.

Is this good? I do not think so.

My main objection is that you will end up with a cloud solution that is no different from any solution you have previously designed and installed yourself, only at a cheaper rate and with a lower-quality SLA.

True cloud solutions should not only focus on cost reduction, increased agility and flexible capabilities. You should also be buying something that supports portability between the private and public computing domains, and across different vendor platforms.

In early cloud solutions, mainly the ones focusing on Infrastructure-as-a-Service, this portability was heavily debated (remember the ‘Open Cloud Manifesto’?), and in the end we concluded that server virtualization solved a lot of the portability issues (I am simplifying, of course).

We also had Software-as-a-Service, and some publications showed that its portability could be addressed by looking at standardized business process definitions and data normalisation (again, I am simplifying).

Now the Atos Scientific Community has published a whitepaper that looks at the most complex form of cloud computing: Platform-as-a-Service.

PaaS offerings today are diverse, but they share a vendor lock-in characteristic. As in any market for an emerging technology, there is a truly diverse array of capabilities being offered by PaaS providers, from supported programming tools (languages, frameworks, runtime environments and databases) to various types of underlying infrastructure, even within the capabilities available for each PaaS.


So a common characteristic that can be extracted from all this diversity is that PaaS users are currently bound to the specific platform they use, which makes it difficult to port the software (and data) they create on top of these platforms.

As a result, we see a slow adoption of PaaS in the enterprise; only those groups that have a very well-defined end-user group are looking at PaaS – and mostly for the wrong reason: ‘just’ cost saving through standardization.

In the Atos Scientific Community whitepaper these are identified as:

“Two primary user groups which benefit from using Cloud at the Platform as a Service level: Enterprises with their own internal software development activities and ISVs interested in selling SaaS services on top of a hosted PaaS.”


The current situation, where PaaS mostly results in vendor lock-in scenarios, is holding back the full potential of applications on a PaaS.

By introducing a general-purpose PaaS, we could provide a comprehensive, open, flexible and interoperable solution that simplifies the process of developing, deploying, integrating and managing applications in both public and private clouds.

Such an architecture is proposed and explained in detail in the whitepaper: it describes the desired capabilities and building blocks that need to be established, and it offers an analysis of market trends and existing solutions in order to establish a future vision and direction for PaaS, as well as outlining the business potential of such a solution.
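As an informal illustration of what a provider-agnostic layer could mean for a developer, the sketch below describes an application once and swaps thin adapters per PaaS to move it between a private and a public cloud. The interface and the two adapters are purely hypothetical; they do not reflect the concrete building blocks defined in the whitepaper.

```python
# Hypothetical sketch of a provider-agnostic PaaS layer: the application model
# stays the same, only the adapter that talks to a concrete PaaS changes.
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class AppSpec:
    name: str
    runtime: str          # e.g. "python-3.11"
    instances: int
    artifact_url: str     # where the deployable package lives


class PaasAdapter(ABC):
    """Minimal contract every provider-specific adapter has to fulfil."""

    @abstractmethod
    def deploy(self, spec: AppSpec) -> str: ...

    @abstractmethod
    def scale(self, app_id: str, instances: int) -> None: ...

    @abstractmethod
    def undeploy(self, app_id: str) -> None: ...


class PrivateCloudAdapter(PaasAdapter):
    def deploy(self, spec: AppSpec) -> str:
        # Translate the generic spec into the private PaaS's own API calls.
        return f"private/{spec.name}"

    def scale(self, app_id: str, instances: int) -> None:
        print(f"scaling {app_id} to {instances} instances on the private PaaS")

    def undeploy(self, app_id: str) -> None:
        print(f"removing {app_id} from the private PaaS")


class PublicCloudAdapter(PaasAdapter):
    def deploy(self, spec: AppSpec) -> str:
        return f"public/{spec.name}"

    def scale(self, app_id: str, instances: int) -> None:
        print(f"scaling {app_id} to {instances} instances on the public PaaS")

    def undeploy(self, app_id: str) -> None:
        print(f"removing {app_id} from the public PaaS")


# Moving an application between clouds means swapping the adapter,
# not rewriting the application description.
spec = AppSpec(name="order-portal", runtime="python-3.11", instances=3,
               artifact_url="https://artifacts.example.com/order-portal.zip")
for adapter in (PrivateCloudAdapter(), PublicCloudAdapter()):
    app_id = adapter.deploy(spec)
    adapter.scale(app_id, instances=5)
```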

We can all continue to feel positive about the power and the business potential of cloud computing.

Changing your cost base from capex to opex, increasing the speed of your go-to-market strategies and having flexibility in capacity and location are very important for your business.

We should not, however, confuse vendor-specific solutions with cloud solutions only because they promise flexibility in cost and easy deployment; being able to shift and shop around is always better – also in cloud computing.


This blog post is a repost of http://blog.atos.net/sc/2012/10/15/watch-this-space-the-paas-cloud-computing-lock-in-and-how-to-avoid-it/


 

Cloud Orchestration and your Interoperability Strategy

Diagram showing three main types of cloud computing (public/external, hybrid, private/internal) (Photo credit: Wikipedia)

Life for a CIO or CTO used to be complex.

Then came Cloud Computing.

Now life is even more complex.

A year ago the Atos Scientific Community published a whitepaper on Cloud Computing. In the paper the concept was explained and we predicted that interoperability among clouds was going to be a major headache.

The paper also showed the result of a proof of concept in which we connected multiple private and public clouds to perform a single business workflow.

The hypothesis of the paper is that organizations will end up with multiple cloud environments:

“This will be driven by what is most fit for purpose for any given application (or part of it), based on an SLA trade-off between cost and business criticality. The corporate application landscape will therefore also fragment into those layers and into many business processes, requiring access to multiple applications and data connections that will need to span those layers. Unless enterprises consider these implications in advance, they risk building a heterogeneous IT infrastructure, only to discover that their key business processes can no longer be plugged together or supported.”

I think the authors (full disclosure: I was one of them) were right in their assumption, and the situation nowadays is not any better than a year ago.

There are a couple of reasons I wanted to bring this to your attention again.

First, because the paper has been re-launched on www.Atos.net; second, because the paper has been accepted as a submission to the yearly Internet conference WWW2012 (www.www2012.org); and third, because on February 10, 2012 the United Nations announced it will take initiatives to “Aim for Cloud Interoperability”.

At least for me this was a surprise as I saw the UN mainly as an intergovernmental body looking to create lasting world peace.

But if you think this through it actually makes sense. The UN’s International Telecommunication Union (source: www.itu.int) “is committed to connecting all the world’s people – wherever they live and whatever their means. Through our work, we protect and support everyone’s fundamental right to communicate.” And there is a lot more on vision, collaboration, achieving standards etcetera, etcetera.

During the January meeting of the Telecommunication Standardization Advisory Group, an initiative was taken to start a study on this subject of cloud interoperability.

Apparently this was done at the request of “leading CTOs” to “investigate the standardization landscape in the cloud computing market and pursue standards to lead to further commoditization and interoperability of clouds”. (Source: ITU-T Newslog, January 17, 2012.)

This is good news and it is not the only initiative that came out recently.

The Organization for the Advancement of Structured Information Standards (OASIS), previously known as SGML Open, has started a technical committee on “Topology and Orchestration Specification for Cloud Applications” (TOSCA), aiming to make it easier to deploy cloud applications without vendor lock-in, “…while maintaining application requirements for security, governance and compliance” (source: www.oasis-open.org/news/pr/tosca-tc).

The newly formed committee is supported by vendors like CA, Cisco, EMC, IBM, Red Hat, SAP and others.
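To give a feel for the idea behind a topology-centric description, the toy sketch below models an application as nodes and the relationships between them, and lets a generic orchestrator derive the deployment order. This is not TOSCA syntax (TOSCA defines its own specification language); it only illustrates the underlying concept.

```python
# Toy illustration of the topology idea behind TOSCA: describe the nodes and
# the relationships between them once, then let any compliant orchestrator
# work out the deployment order. This is not TOSCA syntax, just the concept.
from graphlib import TopologicalSorter

# Each node is an application component; edges say "depends on / is hosted on".
topology = {
    "web_app":    {"app_server"},       # web app is hosted on the app server
    "app_server": {"vm", "database"},   # app server needs a VM and a database
    "database":   {"vm"},
    "vm":         set(),                # the VM has no dependencies
}

# An orchestrator on any cloud could deploy the components in this order.
for component in TopologicalSorter(topology).static_order():
    print(f"deploy {component}")
```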

In addition, I recently Googled “cloud interoperability” and, when filtering for the last month, I got about 60,000 hits, so I can say that the subject is very much alive on the internet.

The point of all this? I firmly believe that in addition to your cloud strategy, you need to have a cloud interoperability strategy. You need to be aware of emerging standards and you need to talk to your vendor about it.

It is inevitable that some parts of your business will be run “in the cloud”; nowadays it is not only important how to get there, but also how to (securely) stay there while maintaining the flexibility to move around, interconnect your processes and still take end-to-end responsibility.

Like I said at the beginning: life gets more complex.

 


[This blog post is a repost of http://blog.atos.net/sc/2012/03/02/cloud-orchestration-and-your-interoperability-strategy/]

The whitepaper can be downloaded here.