The rise of the democriticians
Democritus commonly gets credit for being the first person to speculate that matter is really all made up of super small indivisible particles: atoms. In information theory, it is common to think of a triple
consisting of a subject, a predicate and an object as being an atom of information. Democriticians hold that the best way to create data models is to start at this triple level and build everything up from there. The idea has a long, long history. It can be argued that Prolog
explored this approach in the Seventies. It can also be argued that the CODASYL
model popular in the days of COBOL - with its network approach to data modelling - also covered this territory. Indeed, the philosopher C.S. Peirce
was arguably drawing RDF diagrams with a quill pen back in 1885.
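To make the triple idea concrete, here is a minimal sketch in Python of a triple-based data model. The class, method and predicate names (TripleStore, query, "capital_of" and so on) are invented for illustration, not drawn from any particular RDF library.

```python
# Minimal sketch of a triple store: every fact is a
# (subject, predicate, object) atom, and queries are
# pattern matches where None acts as a wildcard.

class TripleStore:
    def __init__(self):
        self.triples = set()

    def add(self, subject, predicate, obj):
        self.triples.add((subject, predicate, obj))

    def query(self, subject=None, predicate=None, obj=None):
        # None in any position matches anything.
        return [
            t for t in self.triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)
        ]

store = TripleStore()
store.add("Dublin", "capital_of", "Ireland")
store.add("Dublin", "population", 544107)
store.add("Paris", "capital_of", "France")

# All facts about Dublin:
print(store.query(subject="Dublin"))
# All capital-of relationships:
print(store.query(predicate="capital_of"))
```

The point of the exercise is that there is no schema beyond the triple itself: richer structures (tables, trees, graphs) are built up by accumulating atoms, which is precisely the democritician stance.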
The rise of the parallelizers
It is pretty evident, I think, that we are entering the age of the parallel – whether we like it or not. It seems that Moore's law is slowing down. Individual chips are not getting faster at the rate we have become used to. Instead, more and more cores are crammed into each chip. The term "CPU" is becoming increasingly inaccurate. The Von Neumann architecture
that has served so long as the fundamental abstraction no longer fits the facts. The facts increasingly consist of umpteen virtualized machines, bottomless pits of storage and an increasingly "on demand" approach to compute resource allocation.
It is true that some developers creating cloud computing platforms, or utilizing the cloud ecosystems being created by companies such as IBM
, start by firing up a relational database, but a goodly number are jumping straight to designs based on approaches such as MapReduce
.
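By way of illustration, the core of the MapReduce idea fits in a few lines of Python. This is a toy, single-process sketch; the function names (map_phase, shuffle, reduce_phase) are my own, not part of any particular framework, and real systems run the phases in parallel across many machines.

```python
from collections import defaultdict

# Toy sketch of MapReduce word counting. The data flow
# (map -> shuffle -> reduce) is the same one parallel
# frameworks distribute across a cluster.

def map_phase(documents):
    # Map: emit a (key, value) pair for every word seen.
    for doc in documents:
        for word in doc.split():
            yield (word, 1)

def shuffle(pairs):
    # Shuffle: group all emitted values by key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: combine each key's values into one result.
    return {key: sum(values) for key, values in groups.items()}

docs = ["the cat sat", "the dog sat"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts)  # {'the': 2, 'cat': 1, 'sat': 2, 'dog': 1}
```

What makes this attractive in the age of the parallel is that the map calls are independent of one another, and so are the reduce calls, so both phases can be farmed out to as many machines as you can afford.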