To paraphrase Brian Goetz's excellent "The Concurrency Revolution: The Hardware Story" talk at Devoxx09: "concurrency hides latency". In other words: use concurrency to combat latency. This really seems to be the common theme in current-day software development.
In his talk, Brian explained how CPU speeds have increased at a much faster pace than memory speeds over the last few decades. This has resulted in a situation where going to main memory is now extremely expensive in terms of CPU time: it can take several hundred clock cycles to fetch data from main memory. Brian captured the problem in another very quotable statement: "memory is the new disk". Hardware designers have been trying to minimize the impact of memory latency on computer performance with clever tricks like speculative execution and extensive caching. However, we've now come to a point where these techniques are breaking down because the gap between CPU and memory speed is just too big. As a result we're seeing more and more multi-core CPUs: slower in absolute terms but designed for concurrency, so latency is less of a problem.
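You can see the effect of those caches from plain Java code. The sketch below (my own illustration, not from the talk) sums the same 2D array twice: row-by-row, which walks memory sequentially and keeps the caches happy, and column-by-column, which strides across cache lines and pays a memory stall far more often. Both loops do identical arithmetic; on most machines the second is noticeably slower.

```java
// Illustrative sketch: identical work, very different memory access patterns.
public class CacheDemo {
    // Sequential traversal: consecutive elements share cache lines,
    // and the hardware prefetcher can stay ahead of the loop.
    static long sumRowMajor(int[][] a) {
        long sum = 0;
        for (int i = 0; i < a.length; i++)
            for (int j = 0; j < a[i].length; j++)
                sum += a[i][j];
        return sum;
    }

    // Strided traversal: each access lands on a different cache line,
    // so the CPU spends its time waiting on main memory.
    static long sumColumnMajor(int[][] a) {
        long sum = 0;
        for (int j = 0; j < a[0].length; j++)
            for (int i = 0; i < a.length; i++)
                sum += a[i][j];
        return sum;
    }

    public static void main(String[] args) {
        int n = 2048;
        int[][] a = new int[n][n];
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                a[i][j] = 1;

        long t0 = System.nanoTime();
        long s1 = sumRowMajor(a);
        long t1 = System.nanoTime();
        long s2 = sumColumnMajor(a);
        long t2 = System.nanoTime();

        // Same result, different cost: the difference is pure memory latency.
        System.out.printf("row-major: %d ms, column-major: %d ms (sums equal: %b)%n",
                (t1 - t0) / 1_000_000, (t2 - t1) / 1_000_000, s1 == s2);
    }
}
```

The exact ratio depends on the CPU and cache sizes, but the point stands: the code is the same, the data access pattern is what matters.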
In web applications, a round-trip to the server involves a lot of latency, so web applications are doing more things on the client in JavaScript, hiding this latency by concurrently doing other things: think AJAX. In a typical back-end business processing system, a round-trip to the database also involves a lot of latency, so we use techniques like event-driven architectures to process several transactions concurrently, again hiding the latency cost.
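On the back end, the simplest way to hide round-trip latency is to overlap the round-trips. A minimal sketch with `ExecutorService` (the `fetch` method and its 200 ms sleep are hypothetical stand-ins for a remote call): three calls fired concurrently complete in roughly the time of one, not the sum of all three.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class OverlapDemo {
    // Hypothetical stand-in for a remote/database call: the sleep
    // simulates network latency, during which the CPU is free.
    static String fetch(String name, long latencyMs) {
        try {
            Thread.sleep(latencyMs);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new RuntimeException(e);
        }
        return name + "-result";
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(3);
        long start = System.nanoTime();

        // Submit all three calls up front so their latencies overlap.
        List<Future<String>> futures = new ArrayList<>();
        for (String name : List.of("orders", "customers", "invoices"))
            futures.add(pool.submit(() -> fetch(name, 200)));

        for (Future<String> f : futures)
            System.out.println(f.get());  // get() blocks, but the calls ran concurrently

        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        // Roughly 200 ms total, not 600: concurrency hid the latency.
        System.out.println("elapsed ~" + elapsedMs + " ms");
        pool.shutdown();
    }
}
```

Event-driven designs push the same idea further by not blocking a thread per call at all, but the payoff is identical: latencies overlap instead of adding up.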
More and more, performance is about data, not code (another quote from Brian's talk)! I've seen several situations where the efficiency of a particular piece of software is completely determined by its data access strategy. Relatively speaking, fetching data from the database is so expensive that it doesn't matter that you process that data in a naive or unoptimized way. All of this of course also implies that a key design decision is which data access patterns and data structures the application uses. If you want a fast application, data is your key concern, and you can fight the latency of getting to the data by using concurrency.
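A concrete example of "performance is about data, not code" is the classic N+1 query problem: looking rows up one by one pays the round-trip cost N times, while a single batched query pays it once. The sketch below (with hypothetical `fetchOne`/`fetchMany` stand-ins for real queries; only round-trips are counted, not actual latency) makes the difference explicit.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchDemo {
    static int roundTrips = 0;  // counts simulated database round-trips

    // Hypothetical one-row lookup: costs a full round-trip per call.
    static String fetchOne(int id) {
        roundTrips++;
        return "row-" + id;
    }

    // Hypothetical batched lookup: one round-trip for the whole id set
    // (think "WHERE id IN (...)" instead of N separate queries).
    static List<String> fetchMany(List<Integer> ids) {
        roundTrips++;
        List<String> rows = new ArrayList<>();
        for (int id : ids)
            rows.add("row-" + id);
        return rows;
    }

    public static void main(String[] args) {
        List<Integer> ids = List.of(1, 2, 3, 4, 5);

        roundTrips = 0;
        for (int id : ids)
            fetchOne(id);                       // one-by-one: 5 round-trips
        System.out.println("one-by-one: " + roundTrips + " round-trips");

        roundTrips = 0;
        fetchMany(ids);                         // batched: 1 round-trip
        System.out.println("batched: " + roundTrips + " round-trip");
    }
}
```

With each round-trip costing hundreds of times more than the processing in between, the batched version wins no matter how clever the per-row code is.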
Sunday, December 13, 2009