Here is a video providing an overview of the application:
One of its biggest features is the ability to take photos and refocus them after the fact, similar to how the Lytro camera works. Instead of capturing light field information, it prompts the user to slowly slide their phone upwards to capture additional views of the scene. It then uses structure-from-motion and stereo vision algorithms to computationally estimate how far away different parts of the scene are from the camera; the result is called a depth map. If you want to read further on the topic, the Google Research Blog has a great write-up on it.
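To get a feel for the geometry involved, here is a minimal sketch of the stereo step in Python using OpenCV's block matcher on a pair of rectified frames. This is not Pictur3D's actual pipeline; the file names, focal length, and baseline below are placeholder assumptions.

```python
import cv2
import numpy as np

# Two views of the same scene (hypothetical file names). Block matching assumes
# the frames have been rectified into a standard left/right stereo pair.
left = cv2.imread("frame_a.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("frame_b.png", cv2.IMREAD_GRAYSCALE)

# For each pixel, find how far it shifted between the two views (its disparity).
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> pixels

# Nearby objects shift more than distant ones: depth = focal_length * baseline / disparity.
focal_px = 1000.0   # focal length in pixels (assumed calibration value)
baseline_m = 0.05   # how far the camera moved between frames, in metres (assumption)
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = focal_px * baseline_m / disparity[valid]
```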
Once you have the depth information for a scene you can do some fun stuff with it. Pictur3D does a rather simple thing with the depth map: it projects the original image's color data into 3D space and lets you play with the resulting colored point cloud.
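For a concrete picture of that step, here is a rough sketch of how a depth map can be lifted into a colored point cloud with a simple pinhole camera model. The intrinsics and example data are made-up assumptions, not values from Pictur3D.

```python
import numpy as np

def depth_to_point_cloud(depth, rgb, fx, fy, cx, cy):
    """Back-project each pixel (u, v) with depth z into 3D camera coordinates."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)   # one XYZ point per pixel
    colors = rgb.reshape(-1, 3)                            # matching RGB for each point
    return points, colors

# Example with synthetic data: a flat, randomly colored plane 2 m from the camera.
depth = np.full((480, 640), 2.0)
rgb = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
points, colors = depth_to_point_cloud(depth, rgb, fx=525.0, fy=525.0, cx=320.0, cy=240.0)
```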
If you're interested, please feel free to play with the application. It still has some rough edges, and some more cool features are coming soon!
Also, if you find that Lens Blur images you took today aren't showing up in the app, an update (0.0.2) to support the new Google Camera format has already been pushed, but it may take a couple of hours to propagate through the Play Store. Enjoy!