Thankfully we are now past the stage of "megapixel bragging"; every camera today has more than enough megapixels to do the job. Image stabilization (IS) has been the most significant improvement in photography in recent years. If you've never experienced it, hold a camera zoomed to 300+ mm and see how different the image is with and without IS turned on. But one thing that still plagues amateurs and professionals alike is the uneven distribution of light in many images.
On a sunny day, for example, the bright sky causes the camera to close down the aperture and underexpose the subject. Most people learn they can set the camera to allow in more light - backlight compensation - but then the nice bright blue sky turns almost white. The highlights are said to be blown out. It turns out our eyes can see a wider range of light intensity than a camera sensor can capture.
Professional digital photographers have a solution. They spend money on expensive cameras, spend more money on Photoshop, and then spend time combining three separate images - one standard exposure, one overexposed and one underexposed. High-end cameras can easily produce this set of images. This is not an approach most people would want to try, but the results are amazing; details are present in the bright, dark and intermediate areas of the picture.
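The core idea behind combining those three exposures can be sketched in a few lines. This is a simplified "exposure fusion" approach (not any particular camera's or Photoshop's actual algorithm): each pixel is weighted by how close it is to mid-gray, so the underexposed shot contributes the highlight detail and the overexposed shot contributes the shadow detail. The function name and the toy gradient data are illustrative.

```python
import numpy as np

def fuse_exposures(images):
    """Blend a bracketed exposure stack: each pixel in each frame is
    weighted by how well exposed it is (closest to mid-gray), so blown
    highlights and crushed shadows contribute little to the result."""
    # Normalize all frames to [0, 1] and stack them along a new axis.
    stack = np.stack([img.astype(float) / 255.0 for img in images])
    # Well-exposedness weight: a Gaussian centered on 0.5 (mid-gray).
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2))
    # Normalize weights so they sum to 1 across the stack at each pixel.
    weights /= weights.sum(axis=0, keepdims=True)
    fused = (weights * stack).sum(axis=0)
    return (fused * 255).astype(np.uint8)

# Toy example: a gradient strip plus underexposed and overexposed copies.
base = np.linspace(0, 255, 8).reshape(1, 8).astype(np.uint8)
under = (base * 0.5).astype(np.uint8)
over = np.clip(base.astype(int) * 2, 0, 255).astype(np.uint8)
result = fuse_exposures([under, base, over])
```

Real HDR pipelines also align the frames (the camera moves between shots) and do the blending at multiple scales to avoid visible seams, but the weighting idea is the heart of it.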
On September 1, 2010, Apple announced an upgrade to their iOS software - used in iPhones, iPods and iPads. This change allows the camera to take the necessary three images and automatically combine them into an HDR photo. The results looked impressive in the demonstration, but I'll have to see real results before I believe it. This approach does not require anything special in camera electronics or optics, since the process is just rapid image capture followed by sophisticated image processing. Since all smartphones and the upcoming horde of tablets generally include cameras, it is likely other vendors will implement similar capabilities that will produce better images and require little if any change in user behavior.