7 ways Apple, Samsung and Google need to step up their phone cameras in 2019


The camera on your phone is pretty spectacular. In an instant, it can capture a milestone in your child’s life, a night out with friends or that particularly attractive slice of avocado toast in front of you. 

Camera technology on phones has made giant leaps in just a few years, and plenty of phones take outstanding photos. The Huawei P20 Pro and Mate 20 Pro use triple rear cameras; the Samsung Galaxy S9 and Note 9 have a mechanical dual-aperture lens; the Pixel 3 is packed with AI muscle; and the iPhone XR and XS wowed us. But nearly all of these phones are lacking in one area: video.

Phones can produce incredible photos and benefit from compelling features like portrait mode. But video shot on phones still lags dramatically behind in terms of image quality, dynamic range and low-light performance.

It’s time for video from our phones to reach the same level of quality as the photos we take. Here are some ways companies can improve the way phones record video and a few video features that might be nice add-ons, too.

Android phones need better processors for video

The iPhone XR, XS and XS Max capture the best overall videos on any phone you can buy. This comes in part from Apple’s A12 chipset and ‘Neural Engine,’ which speedily process footage.

Meanwhile, Qualcomm, which makes processors for Android phones such as the upcoming Galaxy S10, recently launched the new Snapdragon 855 chipset. CNET tested the new chip and found it was 45 percent faster than last year’s Snapdragon 845 processor. This should help provide some extra muscle for Android phones.

It’s only a matter of time before phones wrangle enough power to record and optimize video via AI and machine learning. Google is in the best position to do this since it already uses AI and machine learning to improve photos. Of course, video requires a phone to handle far more data than a single photo, and the extra burden of ‘enhancing’ footage quickly will likely demand processing power that’s still a couple of generations away.

Add bokeh and portrait mode to video

Portrait mode stands out as one of the most exciting features on phones, but it’s only available for photos. Qualcomm claims that its new chip supports portrait mode for video — something even the newest iPhones don’t have.

Portrait video mode would be a fun feature, but I wonder how well it would work. Flagship phones are in their third generation of portrait mode for photos and some results are still hit-or-miss. One can only imagine what the early, rough days of portrait mode video would look like.

If you have an iPhone with portrait mode, you can get a taste of what a video version might look like by watching the live preview of the effect before you snap a photo.

Improve low-light video like the iPhone’s Auto Low Light FPS

One of the most underrated features on the iPhone XR, XS and XS Max is Auto Low Light FPS. Enabled in the camera settings, it reduces the frame rate in low light to improve video quality. The idea is to let each frame gather a little more light, and the results can be amazing.
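A rough back-of-the-envelope sketch of why this works (purely illustrative; the function names are my own, not Apple’s): lowering the frame rate lengthens the maximum exposure time available to each frame, which translates directly into extra light, measured in photographic stops.

```python
import math

def max_exposure_ms(fps):
    """Longest possible shutter time per frame, in milliseconds."""
    return 1000.0 / fps

def stops_gained(fps_high, fps_low):
    """Extra light per frame, in photographic stops, from dropping frame rate."""
    return math.log2(fps_high / fps_low)

# Dropping from 30fps to 24fps lengthens the maximum exposure per frame:
print(round(max_exposure_ms(30), 1))   # 33.3 ms per frame at 30fps
print(round(max_exposure_ms(24), 1))   # 41.7 ms per frame at 24fps
print(round(stops_gained(30, 24), 2))  # 0.32 stops more light per frame
```

Halving the frame rate outright would gain a full stop, which is why even a modest drop is visible in dim scenes.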

Samsung’s current flagship phones can switch to a wider aperture, which also lets in more light during low-light video recording, but the results are just okay. Beyond that, there isn’t another Android phone we know of with a video feature built specifically to overcome the challenges of low light.


The Pixel 3, Huawei P20 Pro and Mate 20 Pro have a low-light photo mode that compiles multiple photos into a single picture. It would be interesting to see if Google or Huawei find a similar feature for video. In Huawei’s case, it could use exposure information from its other rear cameras and combine it with the video from the main camera to create a beautiful low-light video.
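The principle behind those multi-frame modes is straightforward: averaging N aligned captures of the same scene cuts random sensor noise by roughly a factor of √N. Here’s an illustrative NumPy sketch (it ignores the genuinely hard part, aligning frames shot from a moving hand):

```python
import numpy as np

rng = np.random.default_rng(0)
scene = np.full((100, 100), 0.5)    # true, noise-free scene brightness

# Simulate 16 noisy low-light captures of the same scene
frames = [scene + rng.normal(0, 0.1, scene.shape) for _ in range(16)]

single_noise = np.std(frames[0] - scene)   # noise level in one frame
stacked = np.mean(frames, axis=0)          # average the whole stack
stacked_noise = np.std(stacked - scene)    # noise level after stacking

print(round(single_noise / stacked_noise, 1))  # roughly 4, i.e. sqrt(16)
```

Doing this for video is far harder than for a still: every output frame needs its own stack, aligned against subject motion, which is exactly the kind of workload that demands the processing muscle discussed above.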

Good video needs great audio

It might sound odd, but video is only as good as its audio. Bad audio can make a video unwatchable and that’s something many phones today are shockingly guilty of. I’m looking at you, OnePlus 6T and Pixel 3. It’s not that the microphones on the phones are bad, it’s that they’ll never be close enough to the subject to pick up good sound. Think about how a news reporter in the field holds a stick mic to pick up better-quality audio.

If you don’t have your own audio crew and boom mic operator, perhaps the answer lies in software and processing. Phones could use AI or machine learning to improve and optimize the audio on videos. In a way, that’s what noise-cancelling phone mics already do.

Another approach would be to add more microphones and have the phone mix and optimize the recordings into a single track.

If you’re looking for professional quality audio, there are a number of shotgun mic accessories available, not to mention wireless lavalier mics designed to work with phones.

Refine the quality of slow motion video

It is bonkers to think that a phone can record 1080p video at 240fps, a frame rate many dedicated video cameras can’t match. But not all slow motion is created equal. Comparing 1080p 240fps recordings from the iPhone XS and the Galaxy Note 9 reveals noticeable differences: the iPhone has better image quality and a wider dynamic range than the Note 9.

Companies like Samsung, Huawei, Sony and OnePlus have a ‘super’ slow motion mode that captures 480 or even 960fps. But these videos are recorded at a lower resolution and look pretty bad except in the most optimal of situations.
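For context, the arithmetic behind those modes: footage captured at a high frame rate and played back at a standard 30fps stretches time by the ratio of the two, which also means a long clip contains only a sliver of real-world action. A small sketch (function names are my own):

```python
def slowdown_factor(capture_fps, playback_fps=30):
    """How many times slower the action appears on playback."""
    return capture_fps / playback_fps

def real_time_captured(clip_seconds, capture_fps, playback_fps=30):
    """Seconds of real-world action contained in a slow-motion clip."""
    return clip_seconds * playback_fps / capture_fps

print(slowdown_factor(240))        # 8.0  -> 240fps plays back 8x slower
print(slowdown_factor(960))        # 32.0 -> 'super' slow motion
print(real_time_captured(8, 960))  # 0.25 -> 8s of video holds 0.25s of action
```

That 32x stretch is why 960fps modes capture only a fraction of a second at a time, and why every flaw in image quality is on screen that much longer.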

Slow motion video is another area where processing power can drastically help improve image quality. AI and machine learning might also help slow motion footage look more dramatic without sacrificing resolution.

Add Hitchcock’s dolly zoom effect

Portrait mode really showed off the way a phone can mimic an expensive professional stills camera. Phone makers should add a video-specific feature like the dolly zoom effect that you see in movies such as Vertigo, Jaws and Ratatouille — it’s also known as the Vertigo effect.

This happens when a camera lens zooms in or out while the physical camera moves in the opposite direction. With so many phones adding a second camera for zoom, it’s possible that Apple or Samsung could come up with a way to mimic this effect by using video captured by dual rear cameras. I’m sure the moment phones can pull off the dolly effect, Instagram will be brimming with videos using it.
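Geometrically, the trick is simple: a subject’s size on the sensor is proportional to focal length divided by distance, so to hold the subject steady while the background warps, the lens must zoom in exact proportion to the camera’s movement. A hypothetical sketch under a thin-lens approximation (the function name is my own):

```python
def focal_length_for_dolly_zoom(f0_mm, d0_m, d_m):
    """Focal length needed at distance d_m to keep the subject the same
    apparent size it had at distance d0_m with focal length f0_mm.
    Thin-lens approximation: subject size on sensor ~ f / d."""
    return f0_mm * d_m / d0_m

# Start 2m from the subject at a 28mm-equivalent lens, then pull back to 4m:
print(focal_length_for_dolly_zoom(28, 2.0, 4.0))  # 56.0mm keeps subject size
```

A phone could approximate this by crossfading between its wide and telephoto cameras while you physically move, which is presumably what a dual-camera implementation would do.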

LG added the ‘Point Zoom’ feature to its V30 in 2017, allowing for a slow and steady cinematic zoom into a specific target. My colleague Scott Stein was enamored with the feature because it reminded him of the slow zooms used on Curb Your Enthusiasm.

Oops! You accidentally cut off grandma’s head

Since we’re dreaming big, it’d be great if phones could undo framing mistakes while filming video. If you accidentally cut someone’s head out of a shot, for example, your phone could ‘magically’ add it back in using information it gathered from its second camera.

I know this sounds crazy, but it would be genuinely useful. The same approach could remove an errant thumb or finger that strayed into the frame while you filmed.

Phone cameras need to make serious strides in terms of video quality and features. The solution lies in a mix of more powerful processors alongside AI and machine learning to optimize video beyond what a sensor can capture. If the improvements phone makers made to photos are any indication, there will be plenty of awesome video features available down the road.
