The challenging future of the data center in an IoT landscape

Disclosure: This post was previously published on the Atos Ascent blog and was co-authored by Andrea Sorrentino – minor format and content edits have been applied to fit it to this website.

“Whosoever desires constant success must change his conduct with the times.” This phrase of Machiavelli perfectly aligns with how we should consider the Internet of Things (IoT) – the need to change our mindset regarding the IT industry and how we use data. Better analytics now create an amount of information never obtained before, providing important insights into markets and consumers. The IoT can further enhance the business value extracted from data, and we find ourselves in the early phase of development of this new technology that will shape our vision of the world.

Now, imagine a company that distributes millions of sensors along its production chain in several factories, all sending data about machinery to a central location. On one hand, managers will have access to a large amount of data which can help correct inefficiencies and create business value. McKinsey estimates that if policy makers and businesses get it right, linking the physical and digital worlds could generate up to $11.1 trillion a year in economic value by 2025. On the other hand, the data center involved would probably reach its processing capacity very quickly, as it would be overloaded with data and connections pushed from the sensors. According to Gartner, it would not be technically or economically feasible to maintain every computing activity in a central location with the IoT.

The impact of the IoT

The IoT will have a huge influence on companies’ data center strategies, and the best option is likely to be creating a distributed data center infrastructure: installing smaller facilities close to the devices for local processing, with further aggregation in a central location. This creates a more flexible management system which can be adapted to changing requirements. The old logic of using a centralized data center to reduce costs and increase security simply does not hold in the IoT era. However, any strategy depends on the devices being smart enough to filter data locally, so the system as a whole is not overloaded.
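The local-filtering idea can be sketched in a few lines of code. Everything here is invented for illustration (readings, threshold, field names); the point is only that an edge node forwards a small aggregate plus any anomalies instead of every raw reading:

```python
# Hypothetical temperature readings collected at one factory (edge) site.
readings = [20.0, 21.0, 19.0, 36.0, 20.0, 21.0, 19.0, 20.0]

def summarize_at_edge(readings, anomaly_threshold=30.0):
    """Filter locally: forward only an aggregate plus any anomalies,
    instead of pushing every raw reading to the central data center."""
    anomalies = [r for r in readings if r > anomaly_threshold]
    return {
        "count": len(readings),
        "mean": round(sum(readings) / len(readings), 2),
        "anomalies": anomalies,   # only the exceptional values travel upstream
    }

payload = summarize_at_edge(readings)
# The central site now receives one small record instead of eight raw values.
print(payload)
```

With millions of sensors, this kind of aggregation is what keeps the central data center from being overwhelmed.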

The adoption of the IoT will likely lead to a profound reassessment of data management strategies within businesses, and aspects such as costs and the integration of new technology are hot topics for managers today. The IoT represents a great opportunity for creating smarter companies that are more responsive to market needs. It enhances capabilities that, decades ago, managers could barely imagine: real time analytics that allow for preemptive intervention to avoid potential errors.

Implementing IoT solutions therefore requires a tailored data management strategy, one that reconsiders the role of the data center for the business. The IoT will likely speed up the transition to cloud-oriented infrastructure; companies in different sectors are already gradually running a larger part of their processes on hybrid cloud solutions. The cloud is an enabler of digital transformation which can enhance the potential of the entire infrastructure and support the delivery of better services.

A future for the traditional data center?

The advancement of IoT and cloud computing may reduce businesses’ use of data centers, simply because of the level of scalability and flexibility that companies may need to attain. Clearly, security cannot be underestimated, and companies need to maintain a robust infrastructure around their data. It is likely that data centers will gradually lose strategic importance for most businesses; however, physical locations will still be needed as safe stations of reference in case of system failures.

IT managers need to begin thinking about the best approach to optimize and innovate their infrastructure, ensuring it doesn’t quickly become outdated in the fast-moving environment enabled by the IoT.

 

Like Shopping? Prepare for something better.

There are three reasons I never go shopping without my smartphone: first, I need to be able to compare the price of what is on sale with the price I would pay elsewhere; secondly, I like to see a review of the product on-line; and thirdly, I need to be able to call my wife when I am in doubt about what kind of groceries, or some other unknown item, is written on her shopping list (female hygiene products are always challenging for me).

“The shopping experience has suffered a dramatic change over the last decades. Offers are larger and more diversified than ever, globalization is a reality and e-commerce is growing exponentially. Buyers are more demanding, discerning and sophisticated while the traditional selling models are not good enough to secure a sustainable sales flow.”

This change in shopping, fueled by mobile technologies and a much deeper understanding of customers’ behaviors and demands, is the subject of a white paper called “The Future of In-Store Shopping”, available for download.

Physical shopkeepers, as explained in the paper, are increasingly under pressure to compete with the e-commerce world: to provide an experience with the same convenience as shopping on-line, while at the same time offering the intimacy and customer satisfaction of getting to touch and discuss a product.

The answer lies in putting the customer at the center of the value chain through an enhanced shopping experience. “Whenever customers interact with the commerce, a new opportunity arises to know them better and offer a more personalized service, which could extend up to negotiating prices on a one-to-one basis.”

New shopping models will be needed to capture the client and deliver the value of being in the shop, combining the convenience of electronic payment and delivery with the physical shop experience. Possible scenarios include personalization, but also enhancing the experience through showcasing of product ranges and providing expert support during the decision-making process.

The reason for being in a store can be further enhanced by making it part of a full end-to-end experience that can even start before you go into the shop. This is something we used to do by sending around leaflets of this week’s offerings, but it can now become a much more sophisticated and personal experience through data analytics of previous purchases or by engaging the customer in communities. This ‘value-flow’, which can even include a post-shopping experience, is explained in detail and allows you to understand how you can set this up yourself.

“The better the retailers take care after a purchase, considering it the ‘purchase before the next purchase’, the more likely they are to have won happy and frequent customers.”

Technology will support this change. New payment methods using mobile devices are increasingly available (we have talked about this before in my blogs, and a white paper dedicated to mobile payments is also available). Other technologies such as geo-location and in-store routing allow consumers to find stores and even navigate to specific locations inside the store. Big Data analytics and all types of product identification through smart labelling, NFC or bar codes will help us track both the consumer and the products inside the store and beyond. Better, ‘always-on’ connectivity will provide enough bandwidth to enrich the physical product with lots of additional (meta-)data, giving the customer even more information.

“Initially consumers will start using basic functionalities (find a store, make a shopping list, get product information, etc.) and once they feel confident and see the value, they will access more complex functionalities (make a shopping basket, self-checkout, mobile payment, cloud tickets, etc.). It is important that all these functions are easy to use and they are designed with the consumer at the center, hiding the complexity of the technologies being used (NFC, image recognition, indoor location, etc.)”

And when we look further into the future, we will see possibilities for consumers to get access to the full product life cycle – where was this chair made, what is the origin of this coffee, what are the ingredients of this pizza? The full ecological footprint will be available for the actual product you are touching and putting in your basket. On top of that, using augmented reality, the shop can adapt itself to your mood, informing the staff that you are open to suggestions or want to be left alone.

“Ultimately, what will make stores interesting in the future is the same thing that makes them interesting today: the physical experience of being there, talking to real people who know their products, touching such products and the unbeatable joy of leaving the store with the product in your hands.”

The paper gives you a comprehensive overview and is a good starting point to understand how customer expectations, technology and the way retailers like to organize their physical business come together. And this is not far away in the future, as I experienced recently when my favorite on-line retailer opened a physical store in my home town – interestingly, the location of the store was the result of asking their on-line customers to find the best spot for them. I’m sure they saved a lot of money because they did not need to hire a specialist; finding the perfect location was outsourced to their customers – in my book, that is clever thinking.


This blog post was previously published at  http://blog.atos.net/blog/2013/11/29/watch-this-space-like-shopping-prepare-for-something-better/ 


Curiosity drives cloud computing

I like asking questions, and I like getting good answers even better. Because of that, I now have a love/hate relationship with search engines. Most of the time they give me a 50% answer: a kind of direction, a suggestion, a kind of coaching towards the real answer. It is like the joke about the consultant: “the right answer must be in there somewhere, because he or she gives me so many responses”.

In spite of all kinds of promises, search engines have not really increased their intelligence. Complex questions with multiple variables are still nearly impossible to get answered, and the suggestions to improve my question are mostly about my spelling, or about a different subject the search engine would have preferred to be questioned on.

So nothing really good has come from search engines then? Well, arguably search engines have brought us cloud computing and very powerful access to lots and lots and lots of data, otherwise known as ‘the world wide web’.

No wonder, then, that I see powerful access and cloud computing as the two most important values to keep while we increase the capacity and intelligence to do real analytics on large data sets.

In a white paper of the Atos Scientific Community, these two elements are explored in great depth:

  • Data Analytics needs cloud computing to create an “Analytics as a Service” model, because that model best addresses how people and organizations want to use analytics.
  • This Data Analytics as a Service (DAaaS) model should not behave as an application; it should be available as a platform for application development.

The first statement, on the cloud computing needs, suggests we can expect analytics to become easily deployed, widely accessible and not dependent on deep investments by single organizations; ‘as a service’ implies relatively low cost and certainly a flexible usage model.

The second statement, about the platform capability of data analytics, however, has far-reaching consequences for the way we implement and build the analytic capabilities for large data collections.

Architecturally, and due to the intrinsic complexities of analytical processes, the implementation of DAaaS represents an important set of challenges, as it is more similar to a flexible Platform as a Service (PaaS) solution than to a more “fixed” Software as a Service (SaaS) application.

It is relatively easy to implement a single application that will give you an answer to a complex question; many of the applications for mobile devices are built on this model (take for example the many applications for public transport departure, arrival times and connections).

This “1-application-1-question” approach is, in my opinion, not sustainable in business environments; we need some kind of workbench and toolkit that is based on a stable and well-defined service.

The white paper describes a proof of concept that has explored such an environment for re-usability, cloud aspects and flexibility. It also points to the technology used and how the technology can work together to create ‘Data Analytics as a Service’.
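To make the contrast with “1-application-1-question” concrete, here is a minimal sketch of the platform idea. The class and method names are purely illustrative (they are not from the white paper): one shared service holds a registry of reusable analytics, and each new question is just a combination of an analytic, a dataset and parameters.

```python
class AnalyticsPlatform:
    """A toy 'Data Analytics as a Service' workbench: reusable analytics
    are registered once and can then answer many different questions."""

    def __init__(self):
        self._analytics = {}   # name -> function over a dataset

    def register(self, name, func):
        """Add a reusable analytic to the platform's toolkit."""
        self._analytics[name] = func

    def run(self, name, dataset, **params):
        """'As a service' entry point: question = analytic + data + parameters."""
        return self._analytics[name](dataset, **params)

platform = AnalyticsPlatform()
platform.register("total", lambda data: sum(data))
platform.register("above", lambda data, limit=0: [x for x in data if x > limit])

sales = [120, 80, 310, 45]
print(platform.run("total", sales))             # one platform ...
print(platform.run("above", sales, limit=100))  # ... many questions
```

A single-purpose mobile app would hard-wire one of these questions; the platform approach keeps the service stable while the questions vary.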


This blog post was previously published at http://blog.atos.net/blog/2013/03/25/watch-this-space-curiosity-drives-cloud-computing/



A new business model in 3 easy steps

If you like curly fries you are probably intelligent (1).

This insight comes from the University of Cambridge. The researchers analysed the data from Facebook to show that ‘surprisingly accurate estimates of Facebook users’ race, age, IQ, sexuality, personality, substance use and political views can be inferred from the analysis of only their Facebook Likes’.

The possibility to collect large amounts of data from everyday activities by people, factory processes, trains, cars, weather and just about anything else that can be measured, monitored or otherwise observed is a topic that has been discussed in our blogs many times.

Sometimes indicated as ‘The Internet of Things’ or, with a different view, ‘Big Data’ or ‘Total Data’, the collection and analysis of data has been a topic for technology observations, a source of concern, and an initiator of new technology opportunities.

This blog is not about the concerns, nor is it about the new technologies. Instead it is about a view introduced by a new white paper by the Atos Scientific Community called “The Economy of Internet Applications”; a paper that gives us a different, more economic view of these new opportunities.

Let’s take a look at a car manufacturer. The car he (or she) builds will contain many sensors. The data from those sensors will support the manufacturer in enabling better repairs for that one car, it can provide data from many cars for an analysis to build a better car in the future, and it can show information to the user of the car (speed, mileage, gas). The driver generates the data (if a car is not driven, there is no data) and both the driver and the car manufacturer profit from the result.

Now pay attention, because something important is happening: When the car manufacturer provides the data of the driver and the car combined to an insurance company, a new business model is created.

The user still puts in the data by using the car, the manufacturer’s sensors in the car still collect the data, but the insurance company gets the possibility to do a better risk analysis on the driver’s behaviour and the car’s safety record.

This would allow the insurance company to give the driver a better deal on his insurance, or sponsor some safety equipment in the car, so there is less risk of big insurance claims in health or property damage.

It would allow the car manufacturer to create more value from data it has already collected, and it would give the driver additional benefits in lower insurance payments or improved safety.
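The data flow in the car example can be sketched as follows. The field names, weights and premium formula are all invented for illustration (this is not an actual insurance model); the point is only that two data sources, the driver's behaviour and the manufacturer's safety record, combine into something a third party can use.

```python
# Data generated by the user driving the car, collected by the manufacturer's sensors.
driver_data = {"hard_brakes_per_100km": 2, "avg_speed_kmh": 95}

# Data held by the manufacturer about this car model (toy 1-5 rating).
car_data = {"safety_rating": 5}

def risk_score(driver, car):
    """Toy combination of both data sources; lower is better."""
    behaviour = (driver["hard_brakes_per_100km"] * 10
                 + max(0, driver["avg_speed_kmh"] - 100))  # penalty above 100 km/h
    safety_discount = car["safety_rating"] * 3
    return max(0, behaviour - safety_discount)

score = risk_score(driver_data, car_data)
premium = 500 + score * 5   # safer driving + safer car = a better deal
print(score, premium)
```

Neither party could compute this alone: the insurer needs both inputs, which is exactly what makes the market multi-sided.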

What just happened is that we created a multi-sided market and it is happening everywhere.

“If you don’t pay for the product, you are the product”

The white paper explains it in more detail but the bottom line is that due to new capabilities in technology, additional data can easily be collected.

This data can be of value for different companies participating in such a data collection and the associated analytics platform.

Based on the economic theory of multi-sided markets, the different participants can influence each other in a positive way, especially across sectors (the so-called network effect).

So there you have it, the simple recipe for a new business model:

  1. Find a place where data is generated. This could be in any business or consumer oriented environment. Understand who is generating the data and why.
  2. Research how (a) that data, or the information in it, can give your business a benefit, and (b) how data that you own or generate yourself can enrich the data from the other parties.
  3. Negotiate the usage of the data by yourself or the provisioning of your data to the other parties.

In the end this is about creating multiple win scenarios that are based on bringing multiple data sources together. The manufacturer wins because it improves its product, the service provider wins because it can improve its service, and the consumer wins because he receives both a better product and a more tailored service.

Some have said that Big Data resembles the gold rush (2) many years ago. Everybody is doing it and it seems very simple; just dig in and find the gold – it was even called ‘data-mining’.

In reality, with data it is nowadays even better: if you create or participate in the right multi-sided market, the data, and thus the value, will be created for you.

(1) http://www.cam.ac.uk/research/news/digital-records-could-expose-intimate-details-and-personality-traits-of-millions

(2) http://www.forbes.com/sites/bradpeters/2012/06/21/the-big-data-gold-rush/


This blog post was previously published at http://blog.atos.net/blog/2013/03/18/watch-this-space-a-new-business-model-in-3-easy-steps/


Would you like a cup of IT?

The change in the IT landscape brought about through the introduction of Cloud Computing is now driving the next generation of IT enablement. You might call it Cloud 2.0, but the term 'Liquid IT' much better covers what is being developed.

In a recently published white paper by the Atos Scientific Community, Liquid IT is positioned not only as a technology or architecture; it is also very much focused on the results of this change on the business you are doing day to day with your customer(s).

"A journey towards Liquid IT is actually rather subtle, and it is much more than a technology journey"

The paper explains in detail how the introduction of more flexible IT provisioning, now done in real time, allows for financial transparency and agility. A zero-latency provisioning and decommissioning model, complete with genuine utility pricing based on actual resources consumed, enables us to drive the optimal blend of minimized cost and maximized agility. Right-sizing capabilities and capacity at all times to the needs of the users will impact your customer relationship – but, very importantly, designing such a system starts with understanding the business needs.
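The utility-pricing idea can be illustrated with a toy calculation. The rates and hourly usage figures below are invented; the point is that the charge follows actual consumption hour by hour, so a decommissioned resource immediately stops costing anything.

```python
# Assumed unit prices, purely for illustration.
RATE_PER_CPU_HOUR = 0.05
RATE_PER_GB_HOUR = 0.01

# (cpu_cores, memory_gb) actually provisioned in each hour;
# (0, 0) means the capacity was released for that hour.
usage_by_hour = [(4, 16), (4, 16), (8, 32), (0, 0), (2, 8)]

def invoice(usage):
    """Genuine utility pricing: sum only what was really consumed."""
    return sum(cpu * RATE_PER_CPU_HOUR + mem * RATE_PER_GB_HOUR
               for cpu, mem in usage)

print(round(invoice(usage_by_hour), 2))
```

Contrast this with classic fixed provisioning, where the idle fourth hour would have been billed at peak capacity anyway.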

"Liquid IT starts from the business needs: speed, savings, flexibility, and ease of use"

Existing examples of extreme flexibility in IT (think Gmail, Hotmail or other consumer-oriented cloud offerings) have had to balance standardization against scale: the more standard the offering, the greater the scale that can be achieved. This has always been a difficult scenario for more business-oriented applications. The paper postulates that, with proper care for business needs and the right architecture, similar flexibility is achievable for business processes.

Such a journey to 'Liquid IT' indeed includes tough choices in technology and organization, but also forces the providers of such an environment to have an in-depth look at the financial drivers in the IT provisioning and the IT consumption landscape.

"The objectives of financial transparency dictate that all IT services are associated with agreed processes for allocation, charging and invoicing"

There are two other aspects that need to change in parallel with this move to more agility in IT: the role of the CIO will evolve, and the SLAs that he is either buying or selling will change accordingly.

Change management will transform into Information Management, as the use of IT as a business enabler is no longer the sole concern of the CIO. IT benchmarking will become an increasingly important tool to measure the level of agility achieved for the business owners. The contribution to business performance will be measured and needs to be managed in line with business forecasts.

The white paper authors conclude that "Business agility is the main result of Liquid IT" – sounds like a plan!

This blog post was previously published at http://blog.atos.net/blog/2013/03/08/watch-this-space-would-you-like-a-cup-of-it/