Data-centers may be too hot to handle

[This blog post is a repost of ]

Global warming is certainly a topic the IT industry should care about, especially considering how we contribute to rising CO2 levels and power consumption. Good news, though: apparently we already do care about it. After his earlier report in 2008, Jonathan Koomey, a consulting professor at Stanford University, presented a follow-up report on August 1st, 2011.

In his latest report, Koomey states:

“In summary, the rapid rates of growth in data-center electricity use that prevailed from 2000 to 2005 slowed significantly from 2005 to 2010, yielding total electricity use by data-centers in 2010 of about 1.3% of all electricity use for the world, and 2% of all electricity use for the US.”

Still, that is a lot of power, and since we can expect growth in the amount of data-center space, we need to spend considerable time thinking about further lowering the CO2 footprint of data-centers. A white-paper by the Atos Scientific Community takes an interesting view of the subject. The authors claim that the two worlds of data-centers and 'Control and Command' have until now lived in relatively separate spaces, although:

“… the data-center can be seen as an industrial equipment with processes like electricity management, temperature control, humidity control, physical security, and finally the management of IT equipment themselves…”

After a short description of the technology deployed in data-centers, the authors conclude that:

“…computers kept in the room are not the dumb heaters that the physics rules would describe. They have their own operational rules and constraints, acting at a logical level rather than a physical one, and providing software solutions to manage that logical level. A communication channel between the computers or blades and the Control and Command could be mutually beneficial.”

This introduces a new way to look at data-centers: using end-to-end monitoring to manage the facility as a whole, not as a collection of separate components. The fact that all components interact, and should be managed as such, opens new ways to bring power consumption under control. A better future is possible, but a lot of work still needs to be done.
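To make the idea of a communication channel between the IT equipment and the Control and Command layer concrete, here is a toy sketch of my own (all names and numbers are hypothetical, not taken from the whitepaper): a controller that sees both the physical state (inlet temperature) and the logical state (workload) of each zone, and tries a logical remedy, migrating load, before falling back on the physical one, increasing cooling.

```python
# Toy sketch: treat servers and cooling as one system. Each zone reports
# its logical state (normalized IT load) alongside its physical state
# (inlet temperature), and the controller shifts load toward the coolest
# zone before resorting to more cooling power.

from dataclasses import dataclass

@dataclass
class Zone:
    name: str
    temperature_c: float   # current inlet temperature
    load: float            # normalized IT load, 0.0 to 1.0

def rebalance(zones: list[Zone], setpoint_c: float = 27.0) -> list[str]:
    """If any zone exceeds the temperature setpoint, migrate some of its
    load to the coolest zone; otherwise fall back to more cooling.
    Returns the actions taken."""
    actions = []
    hot = max(zones, key=lambda z: z.temperature_c)
    cold = min(zones, key=lambda z: z.temperature_c)
    if hot.temperature_c > setpoint_c and hot.load > 0.1 and hot is not cold:
        shift = min(0.2, hot.load)            # migrate up to 20% of load
        hot.load -= shift
        cold.load += shift
        actions.append(f"migrate {shift:.0%} load: {hot.name} -> {cold.name}")
    else:
        actions.append("increase cooling")    # no logical remedy available
    return actions

zones = [Zone("row-A", 31.0, 0.9), Zone("row-B", 24.0, 0.3)]
print(rebalance(zones))  # prints ['migrate 20% load: row-A -> row-B']
```

The point of the sketch is the design choice the whitepaper hints at: a "dumb heater" model would only ever raise cooling power, whereas a controller that also sees the logical level can sometimes solve a physical problem with a software action.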

The Atos whitepaper can be downloaded here.


Are the financial markets going for an uncontrolled science fiction scenario?


[This blog post is a repost of ]

“A storm is coming…” (Sarah Connor in “The Terminator”, 1984).

More than one doomsday movie features robots or artificial life forms taking over the world. Skynet, introduced in “The Terminator” (James Cameron, 1984), is such a computer-based entity that, in the end, almost destroys all of mankind, even though it was built with the best intentions.

This scenario came to my mind when I was reading a whitepaper by the Atos Scientific Community. In this paper, called “Computational Finance”, the authors show that:

“…New mathematical algorithms, the latest high-performance computer systems and high-frequency trading (HFT) are taking over from human stockbrokers.”

On top of this, the effect on the stability of the financial markets is not fully understood, leading to extreme volatility of the large stock exchanges and their indexes. That does not sound good. Putting our current predicament in a historical perspective:

“In the early 2000s, the biggest banks and hedge funds developed and expanded algorithmic trading. Complex investment strategies and orders, which would have previously needed several days to process and experienced specialists to process them, could be settled in minutes or hours, with almost no human interaction.”

And apparently it did not stop there, because nowadays even more complex methodologies are introduced:

“The current focus is on so-called high-frequency trading (HFT), a sub-class of algorithmic trading strategies aimed at taking advantage of high-performance computing and low-latency communication networks. Hundreds of stocks are bought and sold in the blink of an eye, with some orders lasting only a handful of microseconds.”

Clearly, in a collaborative or competitive arena where the timing of decisions makes a big difference, it is easy to understand why such an approach to technology is embraced; and research shows that it is now commonplace:

“It is estimated that at the New York Stock Exchange (NYSE), more than 60 percent of trades are currently carried out without direct human intervention.”

By now I am getting worried, because without a proper understanding of the relationships between the complex algorithms and their interdependencies, we could end up in a situation that is… let’s call it “as yet undefined”. A situation that apparently has already occurred, although on a small scale, with only one index:

“Recent events, such as the May 6 2010 stock market crash, when the DOW Jones took the single largest plunge in its history, made the trend for high-performance computing headline news and led people, from the simple newspaper reader to regulatory institutions, to try to figure out just how prevalent HFT really is. The fact is, no one really knows.”

Again, not very reassuring. The whitepaper analyzes the current situation in great detail, explaining the different financial models and mathematical concepts currently in use, as well as the technology that supports this progress. Its conclusion is crystal clear:

“In order to properly address this issue, it is required to fully assess the current state of art of both new mathematical concepts (used for modeling, forecasting and risk assessment) and the latest associated technological solutions (e.g. high-performance / low-latency computing).”


The combination of new market expectations, advanced mathematical models, and high-performance computing leads to new business opportunities: banks and trading agencies, stock exchanges, and regulatory authorities will need a new set of services, ranging from real-time information flows using sentiment analysis to intelligent watchdogs for early anomaly detection.
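As a rough illustration of what such an intelligent watchdog might do (a minimal sketch of my own, not a method from the whitepaper), consider flagging any price tick whose one-step return is a large outlier relative to recent history, a rolling z-score test:

```python
# Hypothetical watchdog sketch: flag a tick as anomalous when its return
# deviates from the trailing window's mean by more than k standard
# deviations. Real market surveillance uses far richer models; this only
# illustrates the idea of early anomaly detection.

from collections import deque
from statistics import mean, stdev

def watchdog(prices, window=20, k=4.0):
    """Yield (index, price) for every price whose one-step return is a
    > k-sigma outlier relative to the trailing window of returns."""
    returns = deque(maxlen=window)
    prev = None
    for i, p in enumerate(prices):
        if prev is not None:
            r = (p - prev) / prev
            if len(returns) >= window and stdev(returns) > 0:
                z = (r - mean(returns)) / stdev(returns)
                if abs(z) > k:
                    yield i, p
            returns.append(r)
        prev = p

# A calm synthetic series with one flash-crash-like drop at index 30:
prices = ([100 + 0.01 * i for i in range(30)]
          + [90.0]
          + [100.3 + 0.01 * i for i in range(30)])
print(list(watchdog(prices)))  # prints [(30, 90.0), (31, 100.3)]
```

On this synthetic series the watchdog flags both the crash tick and the sharp rebound, which is exactly the kind of early signal a regulator would want before an event snowballs.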

So, while the whitepaper is an interesting read on where we are and why all this technology is used in the stock markets, it also cries out that we need to be proactive in understanding what we are doing, in order to avoid future catastrophic events. What do you think: are we re-enacting ‘The Boy Who Cried Wolf’, or is Skynet already active?

The whitepaper can be downloaded here.
