Recently I decided to take a look at Docker, which looks to be a pretty nifty new development in virtualization on Linux. Instead of creating an entire virtual computer with its own copy of an operating system, assigned RAM, and disk space, as with VirtualBox, Docker pares virtualization down to the bare minimum.
Take an application and whatever resources it relies on (say a specific version of Python and a handful of libraries) and put them in a container (think zip file).
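As a sketch of what "an application plus whatever it relies on" looks like in practice, here's a hypothetical Dockerfile; the base image, paths, and file names are invented for illustration, not from any real project:

```dockerfile
# Start from a base image that carries the exact Python version the app needs.
FROM python:2.7

# Copy the application and its pinned dependencies into the container.
COPY myapp/ /opt/myapp/
RUN pip install -r /opt/myapp/requirements.txt

# The command the container runs when started.
CMD ["python", "/opt/myapp/main.py"]
```

Everything the app depends on travels inside the container, so the host only needs Docker itself.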
Load up the Docker software on a 3.8 or newer Linux kernel (Debian- or Red Hat-based Linux currently, though it sounds like it may eventually be available on OS X, the BSDs, and other Unixes).
The Docker engine and operating system work together to run the software in the container in isolation from the rest of the computer. If the containerized software crashes, it won't take down the whole box. Some bug makes it think it should delete everything? It only touches what it had access to, not your whole hard drive.
Extra feature: when you load the container, it's actually used as a master copy from which Docker produces a running copy. So if the containerized app decides to erase itself, the active copy is gone but the original container file is still around.
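You can see that master-copy behavior from the command line. Assuming Docker is installed and you have some image (called `myimage` here purely for illustration), something like:

```shell
# Start a running copy produced from the master image.
docker run --name demo myimage

# Even if the app inside trashed its own files, removing the
# stopped container leaves the master image untouched:
docker rm demo
docker images   # myimage is still listed, ready to run again
```

These commands are a sketch and need a working Docker daemon; the point is that `docker rm` discards only the running copy, never the image it came from.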
Also cool: Docker's assembled from standardized parts. The Docker folks didn't re-invent wheels unnecessarily.
What kind of awesomeness might Docker bring us in the future? Imagine you've got a website and want to add a forum to it. Instead of checking for a specific version of PHP or Python or Ruby or New-Awesome-Web-Language-of-the-Month, bugging your host because they've got too old a version of the language installed, then getting a database configured, then steps x, y, and z of other preliminaries -- simply place Awesome-Forum-Software.dockerfile on your server and tell it to run. Who cares what version of PHP your web server has? The programmers who built the forum included, in the container, the exact version they built and tested it with. Database engine? The one they built and tested the forum with is in the container. You just put the container on the server and tell it to start up.
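That hypothetical deployment might look something like this at the command line; the image name and port numbers are invented for the example:

```shell
# Fetch the forum container -- it carries its own PHP and database engine.
docker pull example/awesome-forum

# Start it in the background, mapping the forum's web port to the host.
docker run -d -p 8080:80 example/awesome-forum
```

No version checks, no database setup on the host: pull it, run it, point your web server at the mapped port.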
Meanwhile the operating system only has to set aside the resources the application actually requires. No need to virtualize an entire computer to run one application, just whatever the specific software in the container needs.
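Docker also lets you make that resource accounting explicit with per-container limits; the flags below exist in Docker's `run` command, though the image name is again illustrative:

```shell
# Cap the container at 256 MB of RAM (-m) and give it a relative
# CPU share weight (-c) instead of reserving whole virtual CPUs.
docker run -m 256m -c 512 example/awesome-forum
```

Compare that with a full virtual machine, where you commit RAM and disk up front whether the guest uses them or not.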
Fanboy crushing time: containers aren't new. This isn't a new awesome thing that's never existed before. Docker is essentially the original IBM PC of containers: it took something that existed in proprietary forms and assembled it from commodity parts, some open-source programming, and standardization to tie those parts together. That's praiseworthy, it's awesome, but what's new is not that it can be done; it's that doing it is now open to all.