I need your help with this nagging question about the difference between cloud and virtualization.
There are two big hype topics in computing today:
- Virtualization – essentially, the ability to treat machines as virtual entities and run their workloads on shared physical hardware
- Cloud – I can’t even begin to give a specific definition that everyone will agree with…
I had this notion that virtualization is a prerequisite for cloud. It seems reasonable: if you want to scale things horizontally, and be able to load balance a large operation, then it is a lot easier to do when everything is virtualized.
And then people start coming up with nagging counterexamples like Google’s services, Facebook and Twitter. All are undoubtedly doing cloud (the thing I refrained from defining), but none of them use virtualization.
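To make that tension concrete, here is a minimal sketch (in Python, with made-up addresses) of why the counterexamples work: a load-balancing layer only sees backend addresses, so horizontal scaling looks exactly the same whether each address belongs to a virtual machine or a bare-metal server.

```python
import itertools

class RoundRobinBalancer:
    """A toy round-robin load balancer.

    It cycles through backend addresses in order. Nothing at this
    layer knows or cares whether an address is a VM, a container,
    or a physical box -- which is how a service can be "cloudy"
    without any virtualization underneath.
    """

    def __init__(self, backends):
        self._cycle = itertools.cycle(backends)

    def pick(self):
        # Return the next backend to receive a request.
        return next(self._cycle)

# Hypothetical backend addresses, purely for illustration.
balancer = RoundRobinBalancer(["10.0.0.1", "10.0.0.2", "10.0.0.3"])
picks = [balancer.pick() for _ in range(6)]
# → ['10.0.0.1', '10.0.0.2', '10.0.0.3', '10.0.0.1', '10.0.0.2', '10.0.0.3']
```

The point of the sketch: the scaling mechanism is agnostic to what the backends are, so virtualization is a convenience for provisioning, not a hard requirement for cloud-style operation.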
It seems like cloud is essential for almost any type of service (not this blog site, mind you – at least not until you bring all your friends, family, acquaintances, neighbors and colleagues to read it on a daily basis).
Anyway, here are some questions I have – I am relying on your collective experience – especially those coming from companies that deploy services (with or without WebRTC):
- How exactly are cloud and virtualization different? What would you say is the single most distinct difference?
- When would you use cloud technologies but skip virtualization?
- In which types of companies would it make sense to use cloud without virtualization? Is it small companies? Large ones? Technology focused companies? Something else?
- Would you say that today, any service should run in a cloud that uses virtualization unless proven otherwise? Or is it the other way around?
Keep the answers coming – I am appreciative of them already.