Cloud Orchestration and your Interoperability Strategy

English: Diagram showing three main types of cloud computing (public/external, hybrid, private/internal) (Photo credit: Wikipedia)

Life for a CIO or CTO used to be complex.

Then came Cloud Computing.

Now life is even more complex.

A year ago the Atos Scientific Community published a whitepaper on Cloud Computing. The paper explained the concept and predicted that interoperability among clouds was going to be a major headache.

The paper also showed the result of a proof of concept in which we connected multiple private and public clouds to perform a single business workflow.
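Conceptually, such a cross-cloud workflow chains steps that each run in a different cloud environment. The sketch below is purely illustrative (the step and environment names are made up, not taken from the actual proof of concept), and it replaces the remote calls with local functions:

```python
# Hypothetical sketch of a business workflow spanning multiple clouds.
# Step names and environments are illustrative, not from the whitepaper.

def run_in_private_cloud(order):
    # e.g. a credit check kept in-house for compliance reasons
    order["credit_checked"] = True
    return order

def run_in_public_cloud(order):
    # e.g. a scalable fulfilment step placed in a public cloud
    order["fulfilled"] = True
    return order

# the workflow is a sequence of (environment, step) pairs
WORKFLOW = [("private", run_in_private_cloud), ("public", run_in_public_cloud)]

def execute(order):
    for environment, step in WORKFLOW:
        order = step(order)  # in reality: a remote call into that cloud
        order.setdefault("trail", []).append(environment)
    return order

order = execute({"id": 42})
print(order["trail"])  # which clouds touched the order
```

In a real deployment each step would be a remote service call, and the trail is exactly where interoperability matters: every hop crosses a cloud boundary.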

The hypothesis of the paper is that organizations will end up with multiple cloud environments:

“This will be driven by what is most fit for purpose for any given application (or part of it), based on an SLA trade-off between cost and business criticality. The corporate application landscape will therefore also fragment into those layers and into many business processes, requiring access to multiple applications and data connections that will need to span those layers. Unless enterprises consider these implications in advance, they risk building a heterogeneous IT infrastructure, only to discover that their key business processes can no longer be plugged together or supported.”

I think the authors (full disclosure: I was one of them) were right in their assumption, and the situation today is no better than it was a year ago.

There are a couple of reasons I wanted to bring this to your attention again.

First, because the paper has been re-launched on www.Atos.net; second, because the paper has been accepted as a submission to the yearly Internet conference WWW2012 (www.www2012.org); and third, because on February 10, 2012 the United Nations announced it will take initiatives to “Aim for Cloud Interoperability”.

At least for me this was a surprise as I saw the UN mainly as an intergovernmental body looking to create lasting world peace.

But if you think this through it actually makes sense. The UN’s International Telecommunication Union (source: www.itu.int) “is committed to connecting all the world’s people – wherever they live and whatever their means. Through our work, we protect and support everyone’s fundamental right to communicate.” And there is a lot more on vision, collaboration, achieving standards etcetera, etcetera.

During the January meeting of the Telecommunication Standardization Advisory Group, an initiative was taken to start a study on the subject of cloud interoperability.

Apparently this was done at the request of “leading CTOs” to “investigate the standardization landscape in the cloud computing market and pursue standards to lead to further commoditization and interoperability of clouds”. (Source: ITU-T Newslog, January 17, 2012.)

This is good news and it is not the only initiative that came out recently.

The Organization for the Advancement of Structured Information Standards (OASIS), previously known as SGML OPEN, has started a technical committee on “Topology and Orchestration Specification for Cloud Applications” (TOSCA) aiming to make it easier to deploy cloud applications without vendor lock-in, “…while maintaining application requirements for security, governance and compliance” (source: www.oasis-open.org/news/pr/tosca-tc).

The newly formed committee is supported by vendors like CA, Cisco, EMC, IBM, Red Hat, SAP and others.

In addition, I recently Googled “Cloud interoperability” and, when filtering for the last month, got about 60,000 hits, so I can say that the subject is very much alive on the internet.

The point of all this? I firmly believe that in addition to your cloud strategy, you need to have a cloud interoperability strategy. You need to be aware of emerging standards and you need to talk to your vendor about it.

It is inevitable that some parts of your business will be run “in the cloud”; nowadays it is not only important how to get there, but also how to (securely) stay there while maintaining the flexibility to move around, interconnect your processes and still take end-to-end responsibility.

Like I said at the beginning: life gets more complex.

 


[This blog post is a repost of http://blog.atos.net/sc/2012/03/02/cloud-orchestration-and-your-interoperability-strategy/]

The whitepaper can be downloaded here


The Ultimate Question of life, the Universe, and Everything

[This blog post is a repost of http://blog.atos.net/sc/2012/01/21/ultimate-question-of-life-universe-and-everything/ ]

If you are thinking about the number 42 after reading the title of this blog entry, I compliment you on your knowledge of classic science fiction literature – for you there is no reason to read on, as you already know everything.

The vision in Journey 2014: Challenges and Building Blocks

For all others, please keep reading, because I am about to give you access to a better answer. In late 2009 a group of smart people in Atos sat together and defined 10 challenges for our IT industry that will play an important role in the coming 5 years. Each of these challenges was thoroughly discussed and examined. The reason we did this was to support Atos in its changing strategy to become a more global organization with a clear view on the future. Since then the results have also been shared with customers, and in 2011 they were bundled in the book “Journey 2014” (available as a download on the Atos website).

There is no particular order of priority among the challenges, so I will present them in alphabetical order and quote from the book to give you a preview of the conclusions – after that I will give you a view on how this all comes together:

1. Alternative Delivery Models

“Organizations should make rapid progress on realizing the benefits of cloud services…” “Cloud computing is such a broad and diverse phenomenon that it is easy to become confused about its many forms and the way organizations can benefit…”

2. Business Process Management

“…Within 3 to 5 years, Business Process Management will become the dominant process change tool used by business stakeholders, working at two levels: first on Business Process within an organization (Orchestration) but also considering End to End processes involving interaction among different players (partners, customers and suppliers) and their systems (Choreography).…”

“A close eye has to be kept on the BPMN 2.0 evolution which may address BPEL and BPMN 1.0 shortcomings…” “An increasing number of BPM vendors are starting to offer BPM software-as-a-service (BPMSaaS). BPM services represent the highest level in the Cloud services. BPMaaS provides the complete end to end business process management needed for the creation and follow-on management of unique business processes.”

3. Context Aware Computing

“The Hyper Inter-Connected world faces an even greater challenge (…) to make sense of the literally trillions of data sources that could influence any given situation. Coupling this with the maturing of the smart phone (…) it paves the way for a new generation of intelligent applications that adapt to the user’s context on time to enrich the delivered experience…”

“…services enabled by context aware computing will anticipate and react to the needs of user, providing relevant, useful information to be able to make better informed decisions. These services will supersede the existing (…) applications and revolutionize how providers interact with consumers, organizations with employees, governments with employees and people with their social networks.”

4. Collaboration

“It is time for companies to catch up and stop ignoring modern collaboration methods that have proved to be very effective in the consumer world. The same way that social networks connect people with common interests, organizations have to take advantage of these solutions to connect people for a given purpose. It is not only a matter of cost saving it is also about improving the Decision Process, empowering employees and reaching consistent and supported consensus.”

“Information Management remains a key priority for enterprises to compete in local and global markets and collaboration is expected to generate even more strategic information which will need to be managed."

5. Control and Command

“Several strategies are being devised to synthesize a large system into a not-too-complex model, such as filtering events based on relevance, or aggregating data at different hierarchical level. Dealing with events coming out too fast is a stressful situation where an operator is more likely to make a mistake. Providing him with the appropriate information, at the right time and the right level of detail is a requirement to have him make an informed decision in time.”

“As the next generation of connected devices has started coalescing into an Internet of Things, control-command techniques will be required to bridge gaps and monitor the massive amount of information these will generate.”

6. Decision Support

“Decision Support has to deal with huge amounts of information, often unstructured, that change dynamically, and whose relevance and timeliness depend on the problem to be solved.” “By combining Business Intelligence capability for analytical insights and measures with collaboration tools and social software, they allow decisions on non-structured problems to be made in a collective way.”

7. Electronic Entertainment and Gaming

“Media consumers tend to become actors while consuming media, which has an important impact on the way media is consumed and edited.” “The trends and technologies developed for the electronic entertainment and gaming market tend to gain other markets, benefitting from the mass market effect to become affordable in the industrial or business world.”

8. Green IT

“The know-how obtained in these practical experiences, if appropriately transferred, would enable IT departments and IT companies to accelerate their capability to serve clients in designing, engineering and operating IT for Green services” “There is a need for Business Transformation capabilities to manage the necessary behavioral change to leverage benefits from Green for IT and IT for Green.”

9. Social Networking

“Effectively using social platforms will be a key objective for companies coping with changing customer and employee relations…” “Creating a reward program for an agile, social engagement that boosts user interaction is not so much a technical as a philosophical or political problem, going from authority to collaboration, from obscurity to transparency, from direct marketing to community management.”

10. Working Environment

“For the foreseeable future, offshoring will remain an effective strategy for reducing cost of service delivery and hence attracting and retaining talent is an issue that equally applies to offshore locations. Organizations must extend the working environment vision to apply to offshore locations.”

“Organizations will have to go beyond traditional financial incentives as the majority of employees look beyond money to find a meaning for their lives. With work life encroaching on home life, benefits from employers must reflect personal needs too.”

Bringing it all together

When we look at the various challenges (and, by the way, there is much more info in the book), there is a need to understand how we can connect the dots – what is the overall idea, or even vision, that drives our behavior towards these challenges. While we were discussing all of the different components, it became very clear that two things are at the heart of our preferred way of interacting with the challenges: handling the results should be simple and allow for a level of control.

This statement of “Simplicity with Control” became a mantra for further investigation and has driven many proof of concepts since.  

The second point of clarity came when we made the decision to put the user at the heart of our set of challenges (and the underlying building blocks). Through collaboration and social networking, the user wants to reach his or her objectives. If we look at the challenges in this way, we conclude that they are not about solving technology questions, but about addressing the user’s needs.

By combining simplicity, control and the needs of the user we have defined the starting point and the context for answering the question that is in the title of this blog. The philosophical statement is that the answer lies within ourselves; and to be honest, I prefer it that way. 


The Atos Journey 2014 whitepaper can be downloaded here


Big Data – Big Problems?

International Bibliography of Periodical Literature (Photo credit: Wikipedia)


[This blog post is a repost of http://blog.atos.net/sc/2011/12/09/watch-this-space-big-data-%e2%80%93-big-problems/ ]


When you run out of space in your cupboard, you go out and buy a new cupboard. You might even choose a similar model so it looks good in your bedroom or kitchen.

If you run out of floor-space in your house, the problem is a bit more complex from a financial point of view – but the solution is similar.

I think we had, for many years, the same expectation in IT. If we ran out of storage, we would buy additional storage. Well, it seems we need to wake up and face the problem, because the solution is not that simple anymore. In a published whitepaper from the Atos Scientific Community (“Open Source Solutions for Big Data Management”) I read:

“[…] several major changes in the IT world have dramatically increased [data storage and processing needs] rate of growth.

[…] Computer capabilities have not increased fast enough to meet these new requirements. When data is counted in terabytes or petabytes, traditional data and computing models can no longer cope.”

This problem forces us to have a different view on storage and database technologies.

Traditional databases that use a relational model cannot process the data quickly enough, and adding additional computing power and memory is not the solution.

Luckily, the issue is being addressed by storage and database vendors – they coined the term “Big Data” and are developing new solutions to make sure we can cope with the rapid increase in the information we want to have available online.

Unfortunately, the impact of these new technologies is big (no pun intended), and there is limited experience in applying the technology successfully and sustainably.

Some vendors are looking towards changes in hardware and provide dedicated storage-boxes that are hardwired to handle large databases or large data-files. Others are looking to provide solutions using new database software.

Most of the software developers and vendors that are facing big data issues are reconsidering the ‘traditional’ relational database model and are bringing new ‘NoSQL’ database models into view.

Based on the amount of marketing and buzz, this ‘NoSQL’ seems to be the next best thing to go with for these types of solutions.
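To make the data-distribution idea concrete: many NoSQL stores scale by partitioning (sharding) the key space across machines, so capacity grows by adding nodes rather than by buying a bigger box. The toy sketch below shows only the routing principle – the node names and hash scheme are illustrative, not from any particular product:

```python
import hashlib

# Toy illustration of horizontal partitioning (sharding), the
# data-distribution idea behind many NoSQL stores.

NODES = ["node-a", "node-b", "node-c"]
stores = {node: {} for node in NODES}   # each node holds its own key space

def node_for(key):
    # hash the key to pick a node; more nodes means more spread-out load
    digest = int(hashlib.md5(key.encode()).hexdigest(), 16)
    return NODES[digest % len(NODES)]

def put(key, value):
    stores[node_for(key)][key] = value

def get(key):
    return stores[node_for(key)].get(key)

put("customer:1001", {"name": "Alice"})
put("customer:1002", {"name": "Bob"})
print(get("customer:1001"))
```

Real systems add replication and rebalancing on top of this routing, which is where much of the engineering effort – and the vendor differentiation – lies.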

So, do we really need all of this stuff? The Scientific Community whitepaper claims:

“In most situations, using NoSQL solutions instead of RDBMS (relational database management systems – paj) does not make sense in cases where the limits of the database have not been reached. Although, given the current exponential growth of data storage requirements, these limits are increasingly likely to be reached in the future. Unless RDBMS evolves quickly to include more advanced data distribution features, NoSQL solutions will become more and more important.”

The specialists have spoken – it is important. We need to care and we need to take action.

An additional problem is that the field is evolving quickly; good solutions are provided by small companies that will soon become part of large providers through acquisitions or other business activities.

I also expect some patent-conflicts (do we not love those?) and maybe some bad choices leading to loss of data.

My recommendation is that you start looking for areas in your organization where this challenge will become a problem very soon. Ask your systems administrators how long database backups and restores take. Ask your system developers if they foresee issues with your next-generation document management or transaction processing system.

And while you are at it – ask your business analyst about the data they need to create meaningful business intelligence reports (and how much time it takes to create them). This will give you a good overview of your Big Data improvement areas.

Do not ask your vendor before having done an internal assessment. You do not want to be stuck with the wrong technology.


The Atos whitepaper can be downloaded here



Data-centers may be too hot to handle


[This blog post is a repost of http://blog.atos.net/sc/?p=298 ]


Global warming is certainly a topic that the IT industry should care about, especially if you consider the way we contribute to rising CO2 levels and power consumption. Good news, though; apparently we do already care about it. Following his earlier report in 2008, Jonathan Koomey, a consulting professor at Stanford University, presented a follow-up report on August 1, 2011.

In his latest report, Koomey states:

“In summary, the rapid rates of growth in data-center electricity use that prevailed from 2000 to 2005 slowed significantly from 2005 to 2010, yielding total electricity use by data-centers in 2010 of about 1.3% of all electricity use for the world, and 2% of all electricity use for the US.”

Still, that is a lot of power, and since we can expect growth in the amount of data-center space, we need to spend considerable time thinking about further lowering the CO2 footprint of data-centers. A white-paper by the Atos Scientific Community takes an interesting view of the subject. The authors claim that the two worlds of data-centers and of 'Control and Command' have up to now lived in relatively separate spaces, although:

“… the data-center can be seen as an industrial equipment with processes like electricity management, temperature control, humidity control, physical security, and finally the management of IT equipment themselves…”

After a short description of technology that is deployed in data-centers, it is concluded that:

“…computers kept in the room are not the dumb heaters that the physics rules would describe. They have their own operational rules and constraints, acting at a logical level rather than a physical one, and providing software solutions to manage that logical level. A communication channel between the computers or blades and the Control and Command could be mutually beneficial.”

This introduces a new way to look at data-centers: using end-to-end monitoring to manage the facility as a whole, not as a collection of separate components. The fact that all components interact and should be managed as such opens new ways to bring power consumption under control. A better future is possible, but a lot of work still needs to be done.
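The kind of "communication channel" the paper describes can be pictured as a control loop that combines a physical signal (room temperature) with a logical one reported by the servers (their utilization). The model below is a deliberately crude, hypothetical sketch, not an actual facility algorithm:

```python
# Hypothetical sketch: cooling decided from both a physical signal
# (room temperature) and a logical signal reported by the servers
# (average utilization), instead of treating machines as dumb heaters.

def cooling_setpoint(room_temp_c, avg_utilization):
    """Pick a cooling level from physical and logical inputs combined."""
    expected_heat = avg_utilization * 10   # crude model: load drives heat
    if room_temp_c + expected_heat > 30:
        return "high"
    if room_temp_c + expected_heat > 24:
        return "medium"
    return "low"

# servers have announced a coming batch workload (high utilization),
# so cooling ramps up even though the room is still cool
print(cooling_setpoint(room_temp_c=22, avg_utilization=0.9))
```

The interesting property is that the controller can pre-cool before the temperature actually rises, because the IT equipment has announced a coming workload – something a purely physical control loop cannot do.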


The Atos whitepaper can be downloaded here.

