Adobe MAX Day 2: Sneak Peeks

The Sneak Peeks culminated an educational second day of Adobe MAX. For those of you not familiar with the “Sneaks”, this is Adobe’s opportunity to showcase some experimental technologies they are working on in their secret underground labs. The disclaimer is that these experiments may never actually make it into the products, so don’t be disappointed if they don’t. Below is a quick summary of what was showcased. For the record, Leonard Nimoy was slated to co-host but fell ill. Who did they get to cover? William freaking Shatner! Yeah, it was a nice surprise. He was hilarious.

Sneak Peeks

  • Flash to HTML5:  They demonstrated the ability to convert Flash animations to HTML5.  This would make it very easy to create alternate content with the richness of Flash or let you use Flash to create complicated HTML animations.
  • Pixel Bender 3D:  This adds native 3D support for Pixel Bender so you can create some very cool effects.
  • Live Flex Design Development:  This is “Live View” for the design mode in Flex.  This would make Flash Builder render your application in design mode so you can make changes “on the fly” without recompiling.
  • Video Tapestries:  Not familiar with a tapestry?  Well, this feature would generate a large scrolling image built from different frames of the video.  The idea is to replace the seek bar (which doesn’t really give you any information) with a collage of images generated on the fly from the video (a rough sketch of the idea follows this list).
  • Flash CPU Performance:  They demonstrated the Flash Player running on both OS X and Windows doing 100% of its rendering on the GPU.  This drastically reduced the CPU usage of the Flash Player.  They also demonstrated a 4K video (more than twice the width and height of standard 1080p video) running fullscreen while using only about 8% of the CPU.  Crazy.
  • ColdFusion/Multiscreen:  This showed some native abilities of ColdFusion to render content differently based on the device viewing it.  For example, a basic form created for searching a page would be automatically adjusted for the different screens (instead of the developer needing to make a “mobile” version of the form).
  • Photoshop:  There were two very experimental technologies.  The first was for adjusting photographs.  Instead of using levels/curves/brightness/contrast like you normally would, you just tell the picture to auto-adjust.  The amazing thing about this feature is how it adjusts the picture: the result looked like it was professionally taken, with perfect lighting.  They also showed choosing another photo and having the existing photo adjusted to match it.  The simple example of this is to choose a sepia tone picture.  Photoshop looks at that picture, sees the sepia tone, and applies it to your photo (the second sketch after this list shows the general idea).
  • The other Photoshop feature was sharpening.  They have some crazy algorithms that look at a blurred photo and determine how the camera moved during the shot.  The algorithm then shifts all the pixels back accordingly.  The results were WAY better than the Unsharp Mask currently in the product.
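For the Video Tapestries idea, here is a toy sketch of the concept (definitely not Adobe’s implementation). It assumes ffmpeg and the Pillow library are installed, the file names are made up, and it just grabs one frame every few seconds and pastes the thumbnails into one long strip you could scroll through instead of a seek bar:

    # Toy version of the "tapestry" idea: grab evenly spaced frames from a video
    # and paste them into one long strip. Assumes ffmpeg and Pillow are installed;
    # the file names are made up.
    import glob
    import subprocess
    from PIL import Image

    def build_tapestry(video_path, out_path, seconds_between_frames=10):
        # Have ffmpeg dump one frame every N seconds as numbered PNGs.
        subprocess.run(
            ["ffmpeg", "-i", video_path,
             "-vf", f"fps=1/{seconds_between_frames}",
             "tapestry_frame_%04d.png"],
            check=True,
        )

        frames = [Image.open(p) for p in sorted(glob.glob("tapestry_frame_*.png"))]
        thumb_w, thumb_h = 160, 90  # small 16:9 thumbnails
        for frame in frames:
            frame.thumbnail((thumb_w, thumb_h))

        # One wide image you could scroll through instead of a seek bar.
        strip = Image.new("RGB", (thumb_w * len(frames), thumb_h))
        for i, frame in enumerate(frames):
            strip.paste(frame, (i * thumb_w, 0))
        strip.save(out_path)

    build_tapestry("conference_demo.mp4", "tapestry.png")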
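And for the Photoshop “match another photo” trick, here is a crude sketch of the general idea using per-channel histogram matching (with NumPy and Pillow; the file names are made up). This is nowhere near whatever Adobe is actually doing; it just remaps each color channel so its distribution looks like the reference photo’s:

    # Crude per-channel histogram matching -- a toy illustration of "adjust this
    # photo to match that one", not Adobe's algorithm. File names are made up.
    import numpy as np
    from PIL import Image

    def match_tones(source_path, reference_path, out_path):
        src = np.asarray(Image.open(source_path).convert("RGB"))
        ref = np.asarray(Image.open(reference_path).convert("RGB"))
        out = np.empty_like(src)

        for c in range(3):  # handle R, G, and B separately
            src_hist, _ = np.histogram(src[..., c], bins=256, range=(0, 256))
            ref_hist, _ = np.histogram(ref[..., c], bins=256, range=(0, 256))

            # Normalized cumulative distributions of both channels.
            src_cdf = np.cumsum(src_hist) / src[..., c].size
            ref_cdf = np.cumsum(ref_hist) / ref[..., c].size

            # Map every source level (0-255) to the first reference level whose
            # cumulative distribution catches up to it.
            mapping = np.searchsorted(ref_cdf, src_cdf).clip(0, 255).astype(np.uint8)
            out[..., c] = mapping[src[..., c]]

        Image.fromarray(out).save(out_path)

    match_tones("my_photo.jpg", "sepia_reference.jpg", "my_photo_matched.jpg")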

Those were a few of the sneaks that were announced.  Keep watching labs.adobe.com for more preview software.

Update: You can view individual videos of all the sneaks here.
