Choose your friends wisely

Sharing your personal information with the founders of Facebook, MySpace, Pinterest, Friendster, Twitter and LinkedIn is probably something you would think twice about. The association of your private stuff with each of these networks is something you want to take very seriously.

There is an interesting tension between social networks and the concept of privacy. Not only because some people will share what others would rather keep secret, but also because the social networks love to know more about you and continuously challenge your boundaries.

Let’s face it (pun intended) – the more you share, the more traffic you generate, the more money they make. It is that simple. So when social networks need to ‘take responsibility’, they are acting against their nature (remember the story of the scorpion that wanted to cross the river?).

“If you are not paying for a product, you are the product being sold”

This tension between your privacy and their business model is described in detail in a recent whitepaper by the Atos Scientific Community (find it here), which concludes:

“Social networking sites have been traditionally reluctant to take into consideration the data privacy concerns brought up by users and public authorities.”

The paper continues by looking into the legal aspects of this subject and describes how we are dealing with the challenge of privacy in social networks. Several examples are cited and examined against the existing rules in Europe and the US.

The paper then goes beyond the legal aspects and also explores the technical side of privacy in social networks. Most interesting is the observation that there is no single technology that will support the need for privacy:

“Privacy needs, inside and outside social networks, are quite different and should be tackled using specifically tailored technologies.”

You can imagine that personal finance and banking information on the one hand, and your holiday pictures on the other, are totally different datasets that each need a different approach. The whitepaper shows this and explains how a difference can be made; it even explores the possibility of a ‘safe’ social network.

The paper gives a full analysis of several technologies that can support a safer social network and allow for better control by the end-user. The authors also express a word of caution about cross-authorization, for example using your Facebook account to log in on other sites.
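To see why that caution is justified, consider what actually happens when you press such a log-in button. Below is a minimal sketch in Python (my own illustration, not taken from the whitepaper) of the first step of an OAuth-style cross-authorization flow; the endpoint, app ID and scope names are hypothetical, but the pattern is standard: every permission listed in the scope is personal data the third-party site can read once you click ‘Allow’.

from urllib.parse import urlencode

# Hypothetical authorization endpoint of a social network.
AUTH_ENDPOINT = "https://social-network.example/oauth/authorize"

params = {
    "client_id": "THIRD_PARTY_APP_ID",               # identifies the third-party site
    "redirect_uri": "https://thirdparty.example/callback",
    "response_type": "code",                         # ask for an authorization code
    "scope": "basic_profile email friends_list",     # the data being handed over
}

# This is the URL your browser is sent to when you hit 'Log in with ...'.
print(AUTH_ENDPOINT + "?" + urlencode(params))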

Finally, the observation is that the social networking domain, in which vendors and end-users struggle to get a grip on privacy, is in fact not ignoring the issue. So there is hope – but that does not change the fact that you still need to think twice before you hit ‘Like’.


This blog post is a repost of http://blog.atos.net/sc/2012/10/08/watch-this-space-choose-your-friends-wisely/ 


6 reasons why Open Innovation is happening now

When is the last time you watched Sesame Street?

I was thinking about a wonderful song about collaboration and co-operation that has been around for a long time (see Sesame Street – "Cooperation Makes It Happen" – first performed in episode 2040, March 1985).

It seems that this way of creating new things is promoted very early in our childhood and is part of our collective memory of positive actions.


So, if Open Innovation is about collaboration and co-operation, you would think we would be doing it more often, right? Well, an upcoming whitepaper by the Atos Scientific Community explores this question and comes to some interesting conclusions.

"…innovation in the 21st Century is increasingly open, collaborative, multi-disciplinary, and global, resulting in greater opportunities, and challenges for traditional R&D approaches."


Apparently there are a number of changes in today's society that enable Open Innovation to become a more natural way of doing things (together).

Firstly, there is the increased pressure on cost and the need to go to market with new products and services much faster than before; having access to a bigger pool of knowledge without having to invest a lot of time building that same pool supports the adoption of Open Innovation.

"The key limitation of Closed Innovation is the lack of leverage of external knowledge and expertise in unknown emerging fields for internal innovation processes."


Secondly, we now see new business models emerging that support the joint development of new products and services. Previously this was quite difficult, but there are more and more companies that will take care of the groundwork – Amazon for flexible compute power and Salesforce for flexible marketing activities, to name just a few well-known examples. Multiple other services are available that deliver the basics in a flexible, pay-as-you-go, get-the-size-you-need model.

Number three is better approaches to intellectual property and the protection of ownership. This takes away the need to keep things secret and allows for more sharing between companies. In addition, the broader understanding of and agreement on the different ways in which Open Source can be applied is also helping companies take a more relaxed attitude towards collaboration.

At number four we see the increased capabilities in social networking and easy communication between different locations and companies. Setting up meetings and long-distance virtual teams has become much easier.

"The walls of the enterprise are (therefore) no longer solid; ideas can filter into the innovation funnel via a ‘bi-directional, semi-permeable membrane’."


And at number five we see the increased understanding that each company holds a key to greatness hidden somewhere deep in the fabric of the organization, waiting to be discovered. At the same time we know that we need a fresh perspective to bring this to the surface, and the additional intelligence of somebody outside to make it grow.

Or, probably most of all, at number six, the generation that loved, watched and learned from Sesame Street is now grown up; they know about this great song “co-operation … makes it happen” and are now putting it into action.


This blog post is a repost of http://atos.net/en-us/about_us/insights-and-innovation/thought-leadership/bin/wp_open_innovation.htm


The Power of Moving Pictures

When I came home from holiday, I connected my HD video-camera to my computer and was able to publish my recordings to YouTube in just one click (publishing on YouTube was actually even easier than burning them to a DVD).

Last month, when I wanted to replace the hard disk in my MacBook, I found more than one detailed video on the internet with perfect instructions on how to do it.

The Khan Academy has a library of over 3,400 videos on everything from arithmetic to physics, finance, and history, plus hundreds of skills to practice, available to anybody connected to the internet.

I can set up a video conference with anybody in my company who is online, in less than a minute.

Is it any wonder that video is high on the list of technologies that are important for organizations to communicate, train, support and sell their products and services?

(Increased) revenues are hidden in better communications and in targeted communication on different channels: television, internet and in-company broadcasting.

The Atos Scientific Community has researched the importance of video, the major technical and business challenges that industries face, and the opportunities opening up for system integrators. The result will be published in an upcoming whitepaper (here).

Video will become so omnipresent and embedded that it will be the normal medium of communication


According to the research, we will see video technology appear in every part of our communication, supported by increasingly capable mobile devices and better connectivity.

It is even expected that video will overtake photography and written text in importance.

Interactive capabilities and integration with social networks provide a large potential market that has not yet been completely monetized

This newfound importance of video is certainly not yet understood by everybody, and some time will pass before the right revenue models are available – most models now are based on counting the number of views or the potential number of people reached.

Increasingly, we will see the value of data (including video) calculated against the impact of that data in social networks, the speed of its distribution and the amount of response it generates.
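As a thought experiment – this is my own illustration, not a model from the whitepaper – such a valuation could combine those three signals into a single score. The weights and the log-scaling below are arbitrary assumptions:

import math

def video_value(views, shares_per_hour, responses):
    # Log-scale each signal so one huge number does not dominate the score,
    # then weight speed of distribution and response over raw reach.
    reach = math.log10(1 + views)
    speed = math.log10(1 + shares_per_hour)
    response = math.log10(1 + responses)
    return 0.2 * reach + 0.5 * speed + 0.3 * response

# A clip with modest reach but fast, reaction-heavy spread scores well.
print(video_value(views=100_000, shares_per_hour=350, responses=1_200))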

Last but not least, I believe video will grow because it will quite simply increase the quality of our communications – people talking with each other on the phone or in instant messaging (I dare not mention email here…), especially across different continents, are missing a big part of what makes communication lead to understanding.

True understanding in any conversation only comes from actually seeing you smile.


This blog post is a repost of http://blog.atos.net/sc/2012/09/18/watch-this-space-the-power-of-moving-pictures/

Data-centers may be too hot to handle


[This blog post is a repost of http://blog.atos.net/sc/?p=298 ]


Global warming is certainly a topic that the IT industry should care about, especially if you consider the way we contribute to rising CO2 levels and power consumption. Good news, though: apparently we already do care about it. Following his earlier report from 2008, Jonathan Koomey, a consulting professor at Stanford University, presented a follow-up report on August 1st, 2011.

In his latest report, Koomey states:

“In summary, the rapid rates of growth in data-center electricity use that prevailed from 2000 to 2005 slowed significantly from 2005 to 2010, yielding total electricity use by data-centers in 2010 of about 1.3% of all electricity use for the world, and 2% of all electricity use for the US.”

Still, that is a lot of power, and since we can expect the amount of data-center space to grow, we need to spend considerable time thinking about further lowering the CO2 footprint of data-centers. A whitepaper by the Atos Scientific Community takes an interesting view of the subject. The authors claim that the two worlds of data-centers and of 'Control and Command' have up to now lived in relatively separate spaces, although:

“… the data-center can be seen as an industrial equipment with processes like electricity management, temperature control, humidity control, physical security, and finally the management of IT equipment themselves…”

After a short description of the technology deployed in data-centers, the paper concludes that:

“…computers kept in the room are not the dumb heaters that the physics rules would describe. They have their own operational rules and constraints, acting at a logical level rather than a physical one, and providing software solutions to manage that logical level. A communication channel between the computers or blades and the Control and Command could be mutually beneficial.”

This introduces a new way to look at data-centers, using end-to-end monitoring to manage the facility as a whole rather than as a collection of separate components. The fact that all components interact, and should be managed as such, opens new ways to bring power consumption under control. A better future is possible, but a lot of work still needs to be done.
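To make that communication channel between the IT equipment and the Control and Command side a little more tangible, here is a minimal sketch in Python. It is my own illustration, not taken from the whitepaper, and every name and threshold in it is an assumption: the point is simply that a cooling decision can use logical signals (planned load, pending workload migrations) next to physical ones (inlet temperature).

def cooling_setpoint(inlet_temp_c, cpu_utilization, migration_planned):
    """Pick a supply-air setpoint from both physical and logical signals."""
    setpoint = 24.0                  # baseline supply-air temperature in degrees C
    if inlet_temp_c > 27.0:          # physical signal: racks are running hot
        setpoint -= 2.0
    if cpu_utilization > 0.8:        # logical signal: heavy load will keep producing heat
        setpoint -= 1.0
    if migration_planned:            # logical signal: workload is about to move away,
        setpoint += 1.0              # so less cooling will soon be needed
    return setpoint

# Hot racks under heavy load, no migration planned: cool more aggressively.
print(cooling_setpoint(inlet_temp_c=28.5, cpu_utilization=0.9, migration_planned=False))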


The Atos whitepaper can be downloaded here.



Are the financial markets heading for an uncontrolled science fiction scenario?



[This blog post is a repost of http://blog.atos.net/sc/?p=273 ]


“A storm is coming…” (Sarah Connor in “The Terminator”, 1984).

There is more than one doomsday movie that refers to robots or artificial life forms taking over the world. Skynet, introduced in the movie “The Terminator” (James Cameron, 1984), is such a computer-based entity that, in the end, almost destroys all of mankind, even though it was built with the best intentions.

This scenario came to my mind when I was reading a whitepaper by the Atos Scientific Community. In this paper, called “Computational Finance”, the authors show that:

“…New mathematical algorithms, the latest high-performance computer systems and high-frequency trading (HFT) are taking over from human stockbrokers.”

On top of this, the effect on the stability of the financial markets is not fully understood, leading to extreme volatility of the large stock exchanges and their indexes. That does not sound good. Putting our current predicament in a historic perspective:

“In the early 2000s, the biggest banks and hedge funds developed and expanded algorithmic trading. Complex investment strategies and orders, which would have previously needed several days to process and experienced specialists to process them, could be settled in minutes or hours, with almost no human interaction. “

And apparently it did not stop there, because nowadays even more complex methodologies are being introduced:

“The current focus is on so-called high-frequency trading (HFT), a sub-class of algorithmic trading strategies aimed at taking advantage of high-performance computing and low-latency communication networks. Hundreds of stocks are bought and sold in the blink of an eye, with some orders lasting only a handful of microseconds. “

Clearly, in a collaborative or competitive arena where the timing of decisions makes a big difference, it is easy to understand why such an approach to technology is embraced; and research shows that it is now commonplace:

“It is estimated that at the New York Stock Exchange (NYSE), more than 60 percent of trades are currently carried out without direct human intervention.”

By now I am getting worried, because without a proper understanding of the complex algorithms and their interdependencies, we could end up in a situation that is… let’s call it “as yet undefined”. A situation that apparently already occurred, although on a small scale with only one index:

“Recent events, such as the May 6, 2010 stock market crash, when the Dow Jones took the single largest plunge in its history, made the trend for high-performance computing headline news and led people, from the simple newspaper reader to regulatory institutions, to try to figure out just how prevalent HFT really is. The fact is, no one really knows.”

Again, not very reassuring. The whitepaper analyzes the current situation in great detail, explaining the different financial models and mathematical concepts currently being used, as well as the technology that supports this progress. The conclusion is crystal clear:

“In order to properly address this issue, it is required to fully assess the current state of the art of both new mathematical concepts (used for modeling, forecasting and risk assessment) and the latest associated technological solutions (e.g. high performance / low latency computing).”

And

“The combination of new market expectations, advanced mathematical models and high-performance computing leads to new business opportunities: banks & trading agencies, stock exchanges, and regulation authorities will need a new set of services, from real-time information flow using sentiment analysis to intelligent watchdogs for early anomaly detection.”
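To give an idea of what such an intelligent watchdog might look like, here is a minimal sketch in Python – my own illustration, not a design from the whitepaper. It flags price moves that deviate strongly from recent behaviour; the window size and the four-sigma threshold are arbitrary assumptions.

from collections import deque
import statistics

def watchdog(prices, window=20, threshold=4.0):
    """Yield the indices of one-step returns that are extreme outliers."""
    returns = deque(maxlen=window)
    for i in range(1, len(prices)):
        r = (prices[i] - prices[i - 1]) / prices[i - 1]
        if len(returns) == window:
            mu = statistics.mean(returns)
            sigma = statistics.stdev(returns)
            if sigma > 0 and abs(r - mu) > threshold * sigma:
                yield i  # a flash-crash-like move, worth a closer look
        returns.append(r)

# A calm, slowly rising market followed by a sudden plunge.
prices = [100 + 0.1 * i for i in range(40)] + [60]
print(list(watchdog(prices)))  # flags the final drop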

So, while the whitepaper is an interesting read on where we are and why all this technology is used in the stock markets, it also cries out that we need to become proactive in understanding what we are doing, in order to avoid future catastrophic events. What do you think: are we re-enacting ‘The Boy Who Cried Wolf’, or is Skynet already active?


The whitepaper can be downloaded here.

