Making Contextual Sense Just Became a Hardware Story

October 4, 2012

The more we do in hardware, the better. What happens when we start moving our context into hardware?

When you look at the roadmap of server computing, you will find three main topics:

  1. Virtualization
  2. General purpose (leading to commoditization)
  3. Low power (taken from the mobile book)

The mobile side of the story is drastically different: it has been investing in low power for years and has added virtualization only in the last couple of years. But it is definitely moving as far away as possible from being general purpose.

To solve power consumption and computing requirements, mobile chipset vendors have gone the specialization route: they now reduce power by shutting down pieces of the chip that are not in use at a given point in time, and even scale clock speeds per core based on its dynamic workload. At the same time, they have taken every piece of activity that can be accelerated – and accelerated it.
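The two tricks described above – power-gating idle cores and scaling clock speed to load – can be sketched as a tiny governor. Everything here is illustrative: the frequency steps, thresholds, and function names are made up for the example and do not come from any real chipset.

```python
# Illustrative sketch of a per-core DVFS governor: idle cores are
# power-gated off; busy cores get the lowest frequency that can still
# absorb their load. Frequency steps are invented for the example.

AVAILABLE_FREQS_MHZ = [300, 600, 1000, 1500]  # hypothetical P-states

def pick_frequency(load: float) -> int:
    """Lowest frequency that can absorb `load` (0.0-1.0 of peak demand)."""
    demand = load * AVAILABLE_FREQS_MHZ[-1]
    for freq in AVAILABLE_FREQS_MHZ:
        if demand <= freq:
            return freq
    return AVAILABLE_FREQS_MHZ[-1]

def schedule(core_loads):
    """Return a per-core setting: gated off when idle, scaled otherwise."""
    settings = []
    for load in core_loads:
        if load == 0.0:
            settings.append("off")                      # power-gate idle core
        else:
            settings.append(f"{pick_frequency(load)} MHz")
    return settings

print(schedule([0.0, 0.1, 0.5, 0.95]))
```

The point of the sketch is that both decisions are cheap and local, which is why they can live in hardware or firmware rather than in the operating system's hot path.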

Today, anything related to multimedia on a smartphone is accelerated. So are networking and communication. And a few months ago, Qualcomm came out with an SDK called Gimbal, which tries to make contextual sense of the hardware information it collects from the device.

Robert Scoble has a nice piece about Gimbal, with a plethora of use cases for it.

http://www.youtube.com/watch?v=H-D3tT6TyF0

Context is a big problem. I'd even say it is a Big Data problem. At the end of the day, it will probably require processing on the server side, using Hadoop or other large-scale NoSQL systems; but it will also require a lot of processing power and smarts on the end devices. An SDK such as Gimbal that does the heavy lifting of accessing the hardware in ways that preserve battery life is a first step towards the goal of context-aware smartphones. It will be interesting to see how this develops in the coming years.
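To make the "smarts on the end device" part concrete, here is a minimal sketch of on-device context inference: fusing a few cheap hardware signals into a coarse context label. The signal names, thresholds, and labels are all hypothetical; a real SDK like Gimbal does far more, and pushes the heavy aggregation to the server side.

```python
# Hypothetical on-device context fusion: combine speed (from GPS or
# accelerometer), time of day, and visible Wi-Fi networks into a coarse
# context label. All thresholds and labels are made up for illustration.

def infer_context(speed_mps: float, hour: int, wifi_ssids: set) -> str:
    if speed_mps > 3.0:                 # faster than a walk: likely a vehicle
        return "in transit"
    if "home-network" in wifi_ssids and (hour >= 20 or hour < 7):
        return "at home, evening"
    if "office-network" in wifi_ssids and 9 <= hour < 18:
        return "at work"
    return "unknown"

print(infer_context(0.5, 22, {"home-network"}))   # stationary, late, at home
print(infer_context(15.0, 8, set()))              # moving fast: in transit
```

Even this toy version shows why battery matters: the inputs have to come from sensors that are cheap to poll, which is exactly the kind of access an SDK can optimize in hardware-aware ways.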

This only strengthens my earlier sentiment: whatever can be shifted into hardware will be. You start by writing code in software, continue by accelerating parts of it with hardware, and end up putting most of that code into hardware and just controlling it with software.

I just wonder when this kind of hardware acceleration will find its way to server technologies for things like databases.
