Your Future: Now available in real-time

Imagine you have an automatically updated, real-time agenda – it continuously adapts your schedule when meetings run long, predicts and updates your travel time to the next meeting in real time, and adjusts your schedule because it ‘knows’ that any meeting with your best client typically takes 30 minutes longer than you originally planned for.

A proof of concept conducted by the Atos Scientific Community looked at this aspect of predictability and used traffic data from the city of Berlin to see if real time traffic forecasting (RTTF) was possible. The results are presented in a recently published white paper.

  “RTTF enables a prediction (within 1 minute) of sensor data streams for the immediate future (up to four hours) and provides traffic condition classification for the upcoming time period based on the forecasted data.”

“The forecast provides a suitable time span for proactively managing upcoming incidents even before they appear.”

The team took a radically different approach to the challenges of today’s traffic management. Instead of proposing yet another reactive traffic management IT system with some smart analytics, the team successfully targeted a proactive traffic management approach that provides analytics to predict critical events before they occur. Using historical data and artificial neural network technology, predictions are created for the immediate future and used to determine the traffic status for the next four hours. Based on that information, actions can be taken proactively to mitigate or avoid upcoming events. Utilizing the software and bringing in data scientists with an understanding of the context was the next step. This helped in defining the right parameters and putting a pattern based strategy (PBS) in place.
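The white paper does not publish the model itself, but the basic idea can be sketched: train a network on a sliding window of historical sensor readings and roll the forecast forward. The sketch below is only an illustration of that idea – the file name, window sizes and the use of scikit-learn’s MLPRegressor are my own assumptions, not the PoC’s actual implementation.

```python
# Rough sketch of the forecasting idea (not the PoC code): learn to predict
# the next sensor reading from a sliding window of historical ones, then
# roll the forecast forward to cover the next four hours.
import numpy as np
from sklearn.neural_network import MLPRegressor

WINDOW = 12    # e.g. the last 12 five-minute readings (1 hour), assumed
HORIZON = 48   # e.g. 48 five-minute steps ahead (4 hours), assumed

def make_training_set(series, window=WINDOW):
    """Turn one sensor's historical flow series into (X, y) pairs."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])
        y.append(series[i + window])
    return np.array(X), np.array(y)

def forecast(model, recent, horizon=HORIZON):
    """Iteratively predict the next `horizon` readings."""
    window = list(recent[-WINDOW:])
    predictions = []
    for _ in range(horizon):
        next_value = model.predict([window])[0]
        predictions.append(next_value)
        window = window[1:] + [next_value]
    return predictions

# historical_flow: vehicles/hour measured by one road sensor (hypothetical file)
historical_flow = np.loadtxt("sensor_42_history.csv")
X, y = make_training_set(historical_flow)
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=1000).fit(X, y)
next_four_hours = forecast(model, historical_flow)
```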

“Being able to identify patterns out of the existing data, model them into patterns and come up with a system that can provide reliable predictions is a remarkable achievement in itself, but the true value of PBS is being able to apply such capabilities to strategy definition and decision making.”

Working with the subject matter experts, the team identified multiple models that were subsequently implemented in the software. The models are important because they keep you from falling into the trap of oversimplification: when a car is driving slowly, it can be because of a traffic jam, but it can also be an older person driving more carefully.

By introducing the concept of ‘flow’ – the number of vehicles passing a sensor each hour – the team could identify four different states, which were themselves parameterized by looking at road capacity, speed limits, etc. This information is then fed into a look-up table based complex event processing engine in order to predict, within one minute, the traffic situation at given locations.
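To make the idea concrete, here is a deliberately simplified illustration of such a flow-based classification. The state names, thresholds and parameters are assumptions of mine for the sake of the example; the actual PoC derives its parameters per road segment from the historical data and expert input.

```python
# Illustrative only: the paper mentions four flow-based traffic states,
# parameterized per road segment; the state names and thresholds below
# are assumptions, not the PoC's actual lookup table.
from dataclasses import dataclass

@dataclass
class Segment:
    capacity: int      # max vehicles per hour the road can carry
    speed_limit: int   # km/h

def classify_flow(segment: Segment, flow: int, avg_speed: float) -> str:
    """Map a sensor's hourly flow and average speed to a traffic state."""
    utilisation = flow / segment.capacity
    if utilisation < 0.5 and avg_speed > 0.8 * segment.speed_limit:
        return "free flow"
    if utilisation < 0.8:
        return "dense"
    if avg_speed > 0.3 * segment.speed_limit:
        return "congested"
    return "jam"

# Example: a two-lane urban road, sensor reports 1800 vehicles/hour at 22 km/h
print(classify_flow(Segment(capacity=2000, speed_limit=50), 1800, 22.0))  # -> "congested"
```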

Because in real life the historical data is continuously refreshed with the most recent events, the system is able to predict the situation on the road in real time.

The proof of concept clearly showed that a self-learning system, combined with a complex event processing engine and the help of subject matter experts and data scientists, can accurately predict the future – the white paper shows this in great detail.

  “Real Time Traffic Forecasting is an excellent example of how data sources and identified patterns can be exploited to gain insights and to develop proactive strategies to deal with upcoming events and incidents. It enables a short term view into the future which is long enough to act on predicted incidents rather than react on occurring ones”

For me this proof of concept shows the benefits of data analytics in everyday life, and I am looking forward to this future.


This blog post was previously published at http://blog.atos.net/blog/2013/12/12/watch-this-space-your-future-now-available-in-real-time/ 


Curiosity drives cloud computing

I like asking questions, and I like getting good answers even better. That is why I now have a love/hate relationship with search engines. Most of the time they give me a 50% answer – a kind of direction, a suggestion, a kind of coaching towards the real answer. It is like the joke about the consultant: “the right answer must be in there somewhere, because he or she gives me so many responses”.

In spite of all kinds of promises, search engines have not really increased their intelligence. Complex questions with multiple variables are still nearly impossible to get answered, and the suggestions for improving my question are mostly about my spelling, or about a different subject the search engine would rather have been asked about.

So is nothing really good coming from search engines then? Well, arguably search engines have brought us cloud computing and very powerful access to lots and lots and lots of data, otherwise known as ‘the world wide web’.

No wonder I envision powerful access and cloud computing as the two most important values we want to keep while increasing the capacity and intelligence to do real analytics on large data sets.

In a whitepaper of the Atos Scientific Community, these two elements are explored in great depth:

  • Data Analytics needs cloud computing to create an “Analytics as a Service” model, because that model best addresses how people and organizations want to use analytics.
  • This Data Analytics as a Service (DAaaS) model should not behave as an application; it should be available as a platform for application development.

The first statement, on the need for cloud computing, suggests we can expect analytics to become easily deployed, widely accessible and not dependent on deep investments by individual organizations; ‘as a service’ implies relatively low cost and certainly a flexible usage model.

The second statement about the platform capability of data analytics however, has far reaching consequences for the way we implement and build the analytic capabilities for large data collections.

Architecturally, and due to the intrinsic complexities of analytical processes, the implementation of DAaaS represents an important set of challenges, as it is more similar to a flexible Platform as a Service (PaaS) solution than to a more “fixed” Software as a Service (SaaS) application.

It is relatively easy to implement a single application that will give you an answer to a complex question; many of the applications for mobile devices are built on this model (take, for example, the many applications for public transport departure times, arrival times and connections).

This “1-application-1-question” approach is in my opinion not a sustainable business model for business environments; we need some kind of workbench and toolkit that is based on a stable and well defined service.
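To illustrate the difference with a toy example of my own (not the architecture from the white paper): an application hard-codes one question, while a platform exposes generic building blocks – data registration and user-defined analyses – that many applications can compose.

```python
# Toy illustration (not the white paper's architecture): a DAaaS platform
# exposes generic building blocks instead of hard-coding one question
# into one application. All names here are invented.
from statistics import mean

class AnalyticsPlatform:
    def __init__(self):
        self.datasets = {}

    def register_dataset(self, name, records):
        """Applications plug their own data sources into the platform."""
        self.datasets[name] = records

    def run(self, dataset, transform, aggregate):
        """Applications phrase their own question as transform + aggregate."""
        return aggregate(transform(r) for r in self.datasets[dataset])

platform = AnalyticsPlatform()
platform.register_dataset("departures", [
    {"line": "S1", "delay_min": 4},
    {"line": "S2", "delay_min": 0},
    {"line": "S1", "delay_min": 9},
])

# One of many possible "applications" built on the same platform:
average_delay = platform.run("departures",
                             transform=lambda r: r["delay_min"],
                             aggregate=mean)
print(average_delay)  # 4.33...
```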

The white paper describes a proof of concept that has explored such an environment for re-usability, cloud aspects and flexibility. It also points to the technology used and how the technology can work together to create ‘Data Analytics as a Service’.


This blog post was previously published at http://blog.atos.net/blog/2013/03/25/watch-this-space-curiosity-drives-cloud-computing/



A new business model in 3 easy steps

If you like curly fries you are probably intelligent (1).

This insight comes from the University of Cambridge. The researchers analysed the data from Facebook to show that ‘surprisingly accurate estimates of Facebook users’ race, age, IQ, sexuality, personality, substance use and political views can be inferred from the analysis of only their Facebook Likes’.

The possibility of collecting large amounts of data from the everyday activities of people, factory processes, trains, cars, the weather and just about anything else that can be measured, monitored or otherwise observed is a topic that has been discussed in our blogs many times.

Sometimes referred to as ‘The Internet of Things’ or, from a different perspective, ‘Big Data’ or ‘Total Data’, the collection and analysis of data has been a topic for technology observations, a source of concern and an initiator of new technology opportunities.

This blog is not about the concerns, nor is it about the new technologies. Instead it is about a view introduced by a new white paper by the Atos Scientific Community called “The Economy of Internet Applications”; a paper that gives us a different, more economic view of these new opportunities.

Let’s take a look at a car manufacturer. The cars it builds will contain many sensors. The data from those sensors helps the manufacturer carry out better repairs on each individual car, can be aggregated across many cars to build a better car in the future, and can show information to the driver (speed, mileage, fuel). The driver generates the data (if a car is not driven, there is no data) and both the driver and the car manufacturer profit from the result.

Now pay attention, because something important is happening: when the car manufacturer provides the combined data of the driver and the car to an insurance company, a new business model is created.

The user still puts in the data by using the car, the manufacturer’s sensors in the car still collect the data, but the insurance company gets the possibility to do a better risk analysis based on the driver’s behaviour and the car’s safety record.

This would allow the insurance company to give the driver a better deal on his insurance, or to sponsor some safety equipment in the car so there is less risk of big insurance claims for health or property damage.

It would allow the car manufacturer to create more value from data it has already collected, and it would give the driver additional benefits in the form of lower insurance payments or improved safety.

What just happened is that we created a multi-sided market and it is happening everywhere.

“If you don’t pay for the product, you are the product”

The white paper explains it in more detail but the bottom line is that due to new capabilities in technology, additional data can easily be collected.

This data can be of value for different companies participating in such a data collection and the associated analytics platform.

Based on the economic theory of multi-sided markets, the different participants can influence each other in a positive way, especially across sectors (the so-called network effect).

So there you have it, the simple recipe for a new business model:

  1. Find a place where data is generated. This could be in any business or consumer oriented environment. Understand who is generating the data and why.
  2. Research how (a) that data, or the information in it, can benefit your business, and (b) how data that you own or generate yourself can enrich the data from the other parties.
  3. Negotiate the usage of the data by yourself or the provisioning of your data to the other parties.

In the end this is about creating multiple-win scenarios based on bringing multiple data sources together. The manufacturer wins because it improves its product, the service provider wins because it can improve its service, and the consumer wins because he receives both a better product and a more tailored service.

Some have said that Big Data resembles the gold rush (2) of many years ago. Everybody is doing it and it seems very simple: just dig in and find the gold – it was even called ‘data-mining’.

In reality, with data nowadays, it is even better: if you create or participate in the right multi-sided market, that data, and thus the value, will be created for you.

(1) http://www.cam.ac.uk/research/news/digital-records-could-expose-intimate-details-and-personality-traits-of-millions

(2) http://www.forbes.com/sites/bradpeters/2012/06/21/the-big-data-gold-rush/


This blog post was previously published at http://blog.atos.net/blog/2013/03/18/watch-this-space-a-new-business-model-in-3-easy-steps/


The Data ‘Explosion’ is real

In a recently published research report called Ascent Journey 2016, the Atos Scientific Community identifies the massive growth in data and storage as an important trend in IT.

Whilst the concept of Big Data has been around for a number of years and is relatively well understood, it is now becoming clear that everything we do is leaving a trail of data that can be analyzed and used.

Examples include the payments we make on a credit card, the books we read on an e-reader and our energy use when driving an electric car. This will lead to a new era of Total Data that, in turn, will lead to new business models, services and economic growth.

We don't yet understand all the implications of this – for businesses and society – but organizations that are able to harness and make sense of the vast quantities of heterogeneous data from disparate sources will gain valuable insights into market trends and opportunities.

An 'Ecosystem' of new management tools is taking shape, covering the various layers of the data stack in the enterprise and capable of delivering a 'Total Data' approach.

The technology that supports the Information Management Lifecycle in the enterprise is going through a profound change, due to the emergence of new solutions, many from an open source background (NoSQL databases, Hadoop, analytical tools like R, visualization tools). To enable the 'Total Data' environment, the new technologies need to connect into, and partly replace, traditional technologies.

  • In some scenarios, data must be obtained, processed and correlated with insights being derived and actions initiated as close to real time as possible.

Yesterday's data is not interesting unless it helps predict tomorrow. Yesterday's traffic report isn't helpful in plotting a journey today unless it is known to represent today's pattern as well and, combined with other data, can help ease future congestion. Pattern Based Strategy enables huge amounts of historical data to be analyzed for previously invisible patterns. These patterns give us the power to start predicting what is likely to happen in the future, so we can plan and improve, both in real time and in non-real-time scenario planning. For example, real-time predictive analysis will plot the route for transporting donor organs across a city safely and quickly, continuously adapting the route to changes in the traffic patterns as they happen. Another example is a country making compliance recommendations (potentially even legal requirements) to companies for the maintenance regimes of their infrastructures or industrial plants, using analytics on historical data and thus establishing an automated "what are the lessons learned" process.
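The donor-organ example boils down to continuously re-planning a route as the predicted travel times are refreshed. A minimal sketch of that loop, with an invented road graph and invented numbers, could look like this:

```python
# Sketch of the donor-organ routing example: whenever fresh forecasts
# arrive, edge weights (predicted travel times) are updated and the
# route is re-planned. The graph and numbers are invented for illustration.
import heapq

def shortest_path(graph, start, goal):
    """Plain Dijkstra over predicted travel times (minutes)."""
    queue, seen = [(0.0, start, [start])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbour, minutes in graph.get(node, {}).items():
            if neighbour not in seen:
                heapq.heappush(queue, (cost + minutes, neighbour, path + [neighbour]))
    return float("inf"), []

# Predicted travel times for the next interval (refreshed every minute)
predicted = {
    "hospital_A": {"junction_1": 6, "junction_2": 9},
    "junction_1": {"hospital_B": 12},
    "junction_2": {"hospital_B": 4},
}
print(shortest_path(predicted, "hospital_A", "hospital_B"))  # 13 min via junction_2

# A predicted incident on the junction_2 approach changes the plan:
predicted["junction_2"]["hospital_B"] = 25
print(shortest_path(predicted, "hospital_A", "hospital_B"))  # 18 min via junction_1
```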

  • Everything will be digital and everything will be connected.

Everything will be captured; "your life is becoming a video" – you can even replay your actions and thoughts and analyse them in various forms and for multiple purposes (see http://quantifiedself.com/ for example). This is not only becoming possible for people's lives; anything that can be measured can be tracked, traced and put in a digital context for analysis. The ability of businesses to process this wealth of information is still unclear. What these developments – and related ones such as 3-D printers and cognitive computers that will be able to replicate smell and touch for their users – mean for society will prove a huge challenge to governments, businesses and individuals in the 21st century; long-established laws and concepts such as individual privacy will need to be reassessed.

  • After an initial confusion phase, traditional and 'Big Data' orientated approaches to analytics will converge in a unified 'Total Data' platform.

Big data relies on its sister technologies of optimized IT networks, rapid mobile communication tools and cloud computing. Data Analytics as a Service could emerge from a combination of Big Data, Pattern Based Strategy and Cloud technologies. Business performance can improve in areas such as forecasting and automation, and organizations can build new business propositions on the discoveries they make using Total Data as a source of previously undiscovered information.

[This blog post is a rewrite of http://blog.atos.net/blog/2013/02/22/the-data-explosion-is-real/?preview=true&preview_id=1555&preview_nonce=6df2f23c80 ]

How big is your robot?

What do you get when you combine cloud computing, social networking, big data and modern-day engineering? You get a kick-ass robot. This was my first thought when I finished reading a whitepaper published by the Atos Scientific Community on the topic of robots.

Central to the paper is the question “Where is the mind of the future robot?”, and once you consider the concept of a robot that can utilize everything available in cyberspace, you may find that question difficult to answer.

Today it is hard to predict where on earth all of the data about you is stored in the cloud, and we have never been able to communicate more easily. It is easy to see that robots will be everywhere, able to utilize all available information. This will lead to a new class of robot personas and capabilities.

Once the robot is part of a social network, it could virtually interact with humans as well and thus start truly mimicking human behavior.


When I was (much) younger we had a program on our home computer called ‘Eliza’. This program would behave as an electronic psychiatrist. It had some limited learning capabilities and some clever language skills to ‘trick’ you into having an actual conversation.

If you would type things like “I hate talking to a computer”, Eliza would answer with “Hate seems to be important to you, can you explain that?”
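The trick behind that behaviour is simple keyword matching and reflection. A miniature sketch of it (my own reconstruction, not the original program) looks something like this:

```python
# A miniature Eliza-style exchange: scan for keywords and reflect them
# back with a canned template. The original program was far cleverer;
# this only shows the basic trick.
import re

RULES = [
    (re.compile(r"\bI hate (.+)", re.IGNORECASE),
     "Hate seems to be important to you. Can you tell me more about {0}?"),
    (re.compile(r"\bI am (.+)", re.IGNORECASE),
     "How long have you been {0}?"),
]

def eliza(utterance: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(match.group(1).rstrip("."))
    return "Please, go on."

print(eliza("I hate talking to a computer."))
# -> "Hate seems to be important to you. Can you tell me more about talking to a computer?"
```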

If we now multiply the capabilities of this ‘Eliza’ by a thousand or more (using cloud computing scalability) and bring in the analytics of all of your ‘likes’ or ‘diggs’ or even the behaviour of your friends, combined with knowledge about your locations and multiply that by analysing all the things you did 5 years ago, 10 years ago and today …. Well I think you get the picture.

The more a future robot knows or has access to, the more it will be able to fulfil its role in supporting us. This may not sit well with everybody, but if we utilize this capability in a clever way, I believe we can benefit.

Especially if we also take into account that a robot can take different forms, exist virtually, or maybe even be in multiple locations at the same time, with access to the right information and the computing power to use it to our benefit. The whitepaper describes some of these scenarios and puts them in the perspective of the role of IT providers and systems integrators.

Based on my reading of the whitepaper I was thinking that maybe the statement ‘I cannot be in two places at the same time’ will soon become a thing of the past.



[This blog post is a repost of http://blog.atos.net/blog/2012/11/26/watch-this-space-how-big-is-your-robot/ ]