
Topaz Labs Image Quality Bundle: A Photographer’s Lifeline

A photo of a wild cat looking back at the camera

The most fun I have with photography is capturing something I’ve never seen before. Of course, that sets up a lot of pressure to come away with tack-sharp perfect photos. Topaz Labs is how I ensure my success.

Topaz Labs’ most popular Image Quality Bundle, which includes Sharpen AI, DeNoise AI, and Gigapixel AI, normally costs $259.97. But now through Monday, November 29, the company has an even bigger Black Friday discount on the software: the Image Quality Bundle can be picked up for $99.99, a 60% savings off its regular retail price. If you’re into video work, you can get the “Everything Bundle,” which includes these three photo apps plus Video Enhance AI, for just $199.98 (normally $559.96). If bundles aren’t for you, you can still get 25% off individual licenses of DeNoise AI, Sharpen AI, and Gigapixel AI. Additionally, Video Enhance AI is available for $99.99, a 67% savings from the regular price of $299.99.

In this article sponsored by Topaz Labs, I want to showcase how I used all three of these applications to enhance one of my wildlife photos. The image of the bobcat I’ll be editing below is exciting to me: not only was it the first bobcat I’d photographed, it was the first one I’d ever seen in the wild. It’s these kinds of special, personal, one-of-a-kind photos that make Topaz Labs products worth every penny, because no matter what I walk away with in the field, I can fix many issues that may pop up under closer examination.

Bobcat straight out of camera photographed with the Sony a7R III and Sony 200-600mm.
After editing in Capture One and now ready for enhancement with the Topaz Labs Image Quality Bundle.

The first step was to take care of all the basic adjustments needed inside Capture One. This included a good-sized crop, some work with exposure, gradient masks to control the light, plus some color alterations to taste. The “after” shown above is what I came up with, and from Capture One I sent the image file out to Photoshop, where I finish my work using Topaz Labs Sharpen AI and DeNoise AI as plugins.

Topaz Labs Sharpen AI

The first issue I want to tackle with the image is the soft focus on the animal. This was a completely unexpected sighting taken from a car on the road, and the cat had already been walking down this path by the time I saw it. Between heat haze and some slight motion blur, I know there’s more detail that can be squeezed out in Sharpen AI.

Sharpen AI settings.

Inside Sharpen AI, the interface is very straightforward. Looking at the side panel, I just go in order of the settings as shown and refine them to taste. First is selecting which Sharpen Model to use. Since this image has a little bit of motion blur, “Motion Blur – Normal” seemed to work best on the bobcat’s face, which is the most important part of the image.

However, because of the heat haze I mentioned, the out-of-focus areas in the image have some unpleasantly hard-edged bokeh mixed in. This confuses the program as to whether or not it should be enhancing those edges, and in this case, I do not want it to. Simple enough: I can click into the “Select” panel, where the program automatically masked the bobcat for me almost perfectly. Using a low-opacity brush, I made a couple of tweaks to the mask to control how much sharpening was being applied to different spots around the cat, with all sharpening focused on the head and less at the rear end. Then I press Apply, which saves the changes to my active layer in Photoshop.

Automatic subject detection masking in Sharpen AI.
Final masking tweaks and ready to save out to Photoshop.
Before Sharpen AI.
After Sharpen AI.

Topaz Labs DeNoise AI

Next, I want to work inside DeNoise AI to deal with the problematic heat distortion. Earlier this week, I detailed in an article how well DeNoise AI combats noise while retaining details. Today, I want to share another way I use it. I’ve found that not only does DeNoise AI do wonders to bring out detail amongst the noise, but it can also help to naturally wash away the areas you don’t want to be detailed — of course, while providing noise reduction at the same time.

DeNoise AI settings.

To do this, inside DeNoise AI I’m looking at either Low Light or Severe Noise in the AI Model panel, since these two allow for zero sharpening to be applied. Remember, we want to reduce the heat distortion artifacts, not enhance them. With Severe Noise selected, I slide Enhance Sharpness and Recover Original Detail to zero. From here I adjust Remove Noise to taste, finding a balance between losing the strange hard edging and not letting it get overly smeary.

Surprisingly, doing this did not affect the bobcat as much as I thought it would, but I still went ahead and made a mask on the bobcat and then inverted it so everything in the photo was targeted with noise reduction except the animal.

Masking inside DeNoise AI.
Before DeNoise AI.
After DeNoise AI, softening the unusual hard edges in the out-of-focus areas without looking obvious.

Topaz Labs Gigapixel AI

After completing my edits with Sharpen AI and DeNoise AI, along with anything else I wanted to clone or fix inside Photoshop, it’s time to finish this in Gigapixel AI. As we saw at the beginning, the photo was heavily cropped, from 42 megapixels down to just 6 megapixels. I really like this image, so I want to put it big on my wall. Enlarging it nearly six times, so that it’s 60 inches wide and 40 inches tall at 300 dots per inch, ought to do it.
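The arithmetic behind that enlargement is easy to sanity-check with a few lines of Python. The crop’s exact pixel dimensions below are my assumption, since the article only gives roughly 6 megapixels:

```python
# Sanity-checking the print math: a 60x40-inch print at 300 DPI, starting
# from a ~6-megapixel crop. The crop's pixel dimensions (3060x2040, a 3:2
# frame) are an assumption for illustration; only ~6 MP is stated.

PRINT_W_IN, PRINT_H_IN, DPI = 60, 40, 300

target_w = PRINT_W_IN * DPI            # 18000 px wide
target_h = PRINT_H_IN * DPI            # 12000 px tall
target_mp = target_w * target_h / 1e6  # 216 MP

crop_w, crop_h = 3060, 2040            # assumed 3:2 crop, ~6.2 MP
scale = target_w / crop_w              # linear upscale factor

print(f"Print needs {target_w} x {target_h} px ({target_mp:.0f} MP)")
print(f"Linear upscale factor: {scale:.2f}x")
```

At that assumed crop width, the factor comes out to about 5.88x linear, which is what Gigapixel AI is being asked to produce.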

Gigapixel AI settings.

Inside Gigapixel AI, I set the Resize Mode to Width and input 60 inches. With the AI Model kept on Standard, it’s that easy to greatly enlarge a photo for printing. Comparing the result from Gigapixel AI to Photoshop, I see much finer details present in the Gigapixel AI version. The application leverages its machine learning to interpret what is being enlarged and can finesse these small details, keeping them from aliasing or becoming smudgy. In comparison, the Photoshop result looks blocky and has obviously run up against its limitations.

Enlarged 5.88x to 60×40 inches using Photoshop.
Enlarged 5.88x to 60×40 inches using Gigapixel AI.

An Essential Set of Tools for All Photographers

The Topaz Labs Image Quality Bundle features three photo editing applications that I would never want to go without. They have all changed the way I approach both photography in the field and editing at the computer. Behind a straightforward interface and simple slider controls lies power and capability that are well worth checking out for yourself.


Welcome to a PetaPixel Showcase, in which our staff gives you a hands-on with unique and interesting products from across the photography landscape. The Showcase format affords manufacturers the opportunity to sponsor hands-on time with their products and our staff and lets them highlight what features they think are worth noting, but the opinions expressed by PetaPixel staff are genuine. Showcases should not be considered an endorsement by PetaPixel.



DXOMARK tests image quality of video doorbells

Around the time that DXOMARK was beginning to study how to evaluate the image quality of home surveillance cameras, one of its new employees installed a video-camera system in his new apartment. The employee was new to the Paris region, and he was eager to try out a home security system, especially since he lived on the ground floor of an apartment complex. It quickly turned out that having the security system at home was a smart decision.
About a week after installation, the employee witnessed his apartment being burgled, live on his smartphone — he even had a conversation with the intruder! And thanks to the surveillance camera, it was all caught on video. The police intervened, and although the intruder was never caught, a video of him in the apartment was kept as evidence.

With incidents like that, it’s not hard to see why home security cameras are a big and growing business. According to a KBV Research market report, the global smart home security camera market is expected to reach $10.4 billion by 2026, growing at a compound annual growth rate (CAGR) of 16.5% during the forecast period. Security issues in developing countries are a key driver of the growth of the home security camera market, according to the research.
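A CAGR compounds like interest, so the quoted figures imply a base-year market size. A minimal sketch of that calculation; the 2020 forecast start year is my assumption, since the report excerpt only gives the 2026 target and the 16.5% rate:

```python
# Sketch of a compound annual growth rate (CAGR) projection using the
# quoted figures. The 2020 forecast start year is an assumption; the
# excerpt only states the $10.4B 2026 target and the 16.5% rate.

CAGR = 0.165
TARGET_2026 = 10.4e9  # USD

# Implied base-year size if the forecast period were 2020-2026:
years = 6
base = TARGET_2026 / (1 + CAGR) ** years
print(f"Implied 2020 market size: ${base / 1e9:.2f}B")

# Year-by-year projection from that assumed base:
for year in range(2020, 2027):
    size = base * (1 + CAGR) ** (year - 2020)
    print(f"{year}: ${size / 1e9:.2f}B")
```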

In an era of connectivity, this growth is in parallel with the growing number of smart home devices hitting the market every day. Most security cameras, for example, can work with Amazon Alexa, Google Assistant, and other virtual assistants.
The main reason for getting a home security camera system is to deter the possibility of a home invasion. Burglars are less likely to try to break into a house knowing that their actions are being recorded or being broadcast on a smartphone or tablet.

Indoor cameras help keep tabs on what is going on at home.

Home security cameras can help identify intruders.

While home security is the main use case for surveillance cameras, there are several secondary use cases that make a home-camera system practical to have. One particularly common use case is being able to interact remotely with a delivery person, or check on delivered packages that get left at the door. Another secondary use case is pet watching.  Now that many workers are returning to the office after having worked from home because of COVID, pets who had become accustomed to having their favorite humans at home might begin to suffer from separation anxiety. An indoor home camera system could help people check in on their pets during the workday, and they can even talk to them via the sound system. Another use case is to monitor who enters the home during the day, whether it’s the kids coming home from school with their friends or the cleaning lady.

Types of home security cameras

The main types of home security cameras are outdoor, indoor, and doorbell cameras.  Outdoor cameras monitor the activity in the yard or the driveway, while doorbell cameras allow you to see and even talk to the person at your door, whether it is a delivery person or an unexpected visitor,  even if you are not home. Indoor cameras record activities in the rooms they are placed in, such as the living room, kitchen, or bedroom—or anywhere a home intruder might be able to enter from the outside.

The choice of which type of camera to get depends on its intended use. But in general, there are some main common qualities of any type of home security camera. They must provide:

  • Good detail rendering at long distances, in the case of outdoor cameras, and at short distances, in the case of indoor and doorbell cameras
  • Wide dynamic range
  • Large fields of view
  • Night vision
  • Audio capabilities, particularly for indoor and doorbell cameras

Challenges and limitations

Home security cameras are usually small so that they can be discreet and unobtrusive. The sensors are about the same size as those found in smartphones, meaning they are much smaller than what you find in  DSLR cameras. This presents some major challenges and limitations when it comes to image quality. Small apertures limit the flow of light to the sensor, which affects exposure and dynamic range. Small sensors mean less light is captured, which could affect the final image.

Another challenge, especially for outdoor cameras, is capturing a wide field of view. To cover a wider field of view, they use fisheye lenses, which have smaller apertures. Fisheye lenses can sometimes create artifacts like distortion and color fringing.

The third challenge is detection, and image quality plays a role in this. The most important and basic function of home security cameras is that they must be able to quickly and accurately detect relevant motion in a use case such as a home invasion. The keyword here is relevant, and image quality from the camera is crucial in this case. If the image quality is poor, the security system could easily send too many false alerts, or worse, it might not send any at all because it did not detect anything.

The way a home security camera is installed also presents a challenge. Is it wired or battery-powered? A system that is battery-powered could be placed anywhere, but it could potentially limit the camera’s performance in terms of image processing so as to not drain the battery too quickly. A system that must be plugged in, on the other hand, might have to be placed in a less-than-ideal location and might struggle to capture good images. Other installation considerations include the ease of setup, network and power requirements, application control, as well as cloud usage and storage.

Another constraint on image quality is the network the security system runs on. Recorded images are compressed before upload, which affects the final image, especially on battery-powered systems. The quality of the network also affects the final image: if the Wi-Fi signal weakens during an upload, compression artifacts will most likely show up in the final image.

Home surveillance cameras also face audio quality challenges and limitations. What’s important is whether voices are intelligible in these kinds of products, in both playback and recording. Outdoor cameras are prone to artifacts such as wind noise, which can make speech hard to understand. In the case of a doorbell camera, the audio quality is affected by the distance of the speaker to the device. The loudness of the captured voices is important, especially in comparison with the volume of the background (signal-to-noise ratio), which may be affected either by the distance from the user to the device or by the environmental conditions (wind, rain, road noise, etc.).

Doorbell benchmark

With image and audio quality being such an integral part of home security systems, it seems natural that DXOMARK, an industry leader in image and audio quality evaluation, would develop a testing protocol for the different types of home surveillance systems.

For our first image quality tests on the home surveillance market, we tested four doorbell cameras: the Google Nest Doorbell (battery), Google Nest Hello (wired), Ring Video Doorbell 4, and Arlo Essential Video Doorbell. All were battery-powered except for the Google Nest Hello. Of the four doorbell cameras, only the Google Nest Doorbell provided a vertical field of view instead of a horizontal one. A vertical field of view provides a head-to-toe view of the person in front of the camera as well as a view of the ground, where delivered packages are likely to be left.


The introductory protocol tested the doorbell cameras for all relevant attributes of image quality. It pays particular attention to exposure, detail preservation, and artifacts, as these were identified as the most important for doorbell use cases. Installation, accurate detection, and audio were not evaluated. Videos were downloaded for analysis and were not evaluated via smartphone applications.

As the videos and illustrations will show, the doorbell cameras were tested in laboratory and real-life conditions. DXOMARK’s methodology combines both objective and perceptual evaluations to arrive at an overall assessment.

Let’s take a closer look at what DXOMARK’s laboratory and perceptual testing revealed about the image quality of doorbell cameras in three conditions: Daylight, Backlit, and Night.

Daylight use cases

When it comes to exposure in daylight conditions, with sunlight on the subject, faces were recognizable and accurately exposed in the images taken with the Google and Ring cameras. While faces were recognizable in the Arlo camera’s images, they were overexposed. The Arlo tended to clip the face in very sunny conditions because of its high target exposure.

All cameras showed an acceptable level of detail in their images, but it depended on the subject’s distance from the camera. Identifying the individual was possible for close targets, but when the subject stood 2 meters or more away from the camera, facial details were almost completely lost. As the video shows, we had subjects approach the camera from a distance in order to see how the level of detail changes.

Some compression artifacts, such as blocking, can appear in the scene, especially with the subject’s movements. This happens when the video codec can’t keep up with the changing pixel information induced by the subject’s movement. The more motion there is, the more pixels have to change from one frame to the next, and the more likely blocking is to occur.

Arlo Video Doorbell (battery)

Ring Video Doorbell 4 (battery)

Google Nest Hello (wired)

Target Exposure Graph EV0. The EV (exposure value) reflects the dynamic of the lab scene, as the difference between the light on the realistic mannequin and the light produced by the LED panel in the scene.
The two graphs show the results from our lab tests for target exposure and texture. Target exposure (top) was accurate for all cameras except the Arlo, which showed the highest level of exposure, too high in this case. In the texture graph, details are low but acceptable for the Ring and the Google Nest Hello (wired).

Backlit conditions create a strong dynamic range in the scene, which is the case if the camera is installed on a porch or covered entrance, for example. All cameras struggled to give an accurate target exposure on the face, and recognition is not assured. In our tests, the battery-powered Google Nest managed to keep quite a high level of detail on faces in this type of condition, while the others had low detail.

The following videos of an HDR backlit condition illustrate a common use case: a daytime package delivery with the doorbell camera placed in a covered entrance.

Arlo Essential, underexposes

Google Nest Hello (wired)

In lab conditions, we can see as well that the target exposure measurements are lower in EV4 conditions than in EV0 conditions for all cameras.
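To make the EV0/EV4 labels concrete: the scene's dynamic in stops is the base-2 logarithm of the ratio between the brightest element (the LED panel) and the light on the mannequin. A small sketch with made-up luminance values, since the actual lab levels are not published in this article:

```python
# EV difference between two parts of a scene, in stops. The luminance
# values in the examples are invented for illustration; the article does
# not publish DXOMARK's actual lab levels.
import math

def scene_dynamic_ev(panel_luminance: float, subject_luminance: float) -> float:
    """Difference in stops (EV) between the panel and the subject."""
    return math.log2(panel_luminance / subject_luminance)

print(scene_dynamic_ev(100, 100))   # 0.0 -> an "EV0" (flat) scene
print(scene_dynamic_ev(1600, 100))  # 4.0 -> an "EV4" (high-dynamic) scene
```

Each extra stop of scene dynamic doubles the brightness gap the camera has to handle, which is why the EV4 condition is so much harder on target exposure.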


In a backlit condition, however, the Arlo Essential overexposes the image. The Nest Doorbell (battery) starts to have slightly low target exposure.


The Arlo Essential has a very limited dynamic range (low entropy). The other doorbells similarly manage to keep some detail in bright parts. (The Google Nest battery and Ring curves are superimposed on one another on this graph because the results were the same.)

Night use cases

The doorbell cameras’ behavior was interesting when it came to night use cases. When light levels decrease or are really low, the cameras switch to night vision, or infrared mode, to see better.

The cameras had different thresholds at which they would activate night vision. The Ring Doorbell 4 switches to night vision at a lower lux level than the other doorbells. In very low light conditions where there was just one light on the model, most doorbells struggled to not overexpose the face. As the videos below show, in some cases, the subject’s face was so overexposed that it was impossible to identify the person.

In the following examples of a night scene, the cameras recorded the scene in IR. A small light simulated a lamp over the front door.

Arlo Essential, low detail preservation

Google Nest Hello, accurate exposure on the subject, but low details

Ring Video Doorbell 4, target exposure set on the background rather than the face.


In the laboratory examples of infrared capabilities below, we can see the difference among the cameras when it comes to preserving details.


Arlo Essential Video Doorbell


Google Nest Hello (wired), lowest target exposure, but better details


Conclusion

The DXOMARK doorbell camera protocol evaluates exposure, detail preservation, artifacts, and color.

Based on our evaluations of one wired doorbell camera and three battery-powered ones, the Google Nest Hello (wired) gave the best image results overall, providing high enough quality to identify people in most conditions, even at night. The battery-powered doorbells held up well in comparison, but the Ring and Google Nest (battery) struggled when it came to low light and night conditions. Still, all doorbell cameras are challenged to some extent to provide correct exposure and facial details when confronted with HDR conditions such as sunsets or covered entrances.

We didn’t talk about color in this initial article, since accurate and pleasant colors on surveillance cameras are nice to have but are not a mandatory feature. In low-light conditions, the cameras switched to infrared mode, where color information is lost. But in the daylight use cases we tested, all cameras showed some sort of color cast in both perceptual and lab testing.


Results of DXOMARK’s doorbell camera benchmark, a protocol for home security cameras.

DXOMARK is evaluating other types of home security cameras, as well, such as outdoor and indoor cameras. More articles about how those cameras performed are coming soon.



Here’s How Good the Image Quality Is From the Sony a7 IV

The Sony a7 IV is probably one of the best bang-for-buck cameras on the market right now. With its high-resolution 33-megapixel full-frame sensor along with a slew of high-end video features, it fits the needs of professional creatives. If you’re thinking about purchasing this camera but want to know how it performs, then this video might be useful to you.

A recent video from Tony & Chelsea Northrup compares the Sony a7 IV to the Canon EOS R6, the Sony a7R III, the Sony a7 III, and the Sony a9 cameras. The main reason these kinds of tests are useful is that spec sheets and press releases don’t always give a clear view of how a camera will perform.

For example, the Sony a7 IV sensor has an AA filter, which, although it can help prevent moiré, also reduces image quality. It’s difficult to know how much of an impact this AA filter is going to have without doing a side-by-side comparison. Other factors to test and consider include banding in artificial light, high-ISO performance, and how the sensor handles noise. In the video, Northrup goes into detail on how each camera performs and demonstrates some of the key improvements in the new camera.

Find out how the Sony a7 IV performs by watching the full video linked above. 



Unistellar eVscope 2 Telescope Review: Fun, But Lacks Image Quality

A few months ago, Unistellar and Nikon announced the launch of a new smart telescope called the eVscope 2. The two companies claimed it was the world’s most powerful and simplest-to-operate digital telescope for consumers. While I absolutely agree the system is incredibly easy and actually fun to use, dubbing it the world’s most powerful is a hefty over-promise.

The company sent us the $4,200 telescope to test out around Los Angeles, where light pollution would be a challenge, but we also coincidentally struggled with a period of consecutive cloudy and hazy days where visibility was incredibly low. Despite taking it out on multiple evenings over the two weeks I had the device, there were only a few brief moments of clear skies available. That being said, despite my issues with Mother Nature, I found the device to be really fun to use with my friends.

Design and Build Quality

Out of the box, the telescope is surprisingly compact and comes with a comfortable and well-designed backpack and a “customized” tripod for safe and easy transport and setup. The tripod sits on the outside of the bag while the telescope itself slides into a custom-fitted backpack with foam inserts and straps to keep it safe while you’re moving around. This includes a small padded “bag” on the inside to hold extra USB-power supplies, cables, tools, and anything else you may find yourself wanting to store in there.

The new eyepiece, which is designed in partnership with Nikon, is a nice new feature that works relatively well. The previous (and much cheaper) model, the eQuinox, relies entirely on a connected app to be able to view and make adjustments.

I can kind of see why.

While I did use the viewfinder to see how things looked compared to the screen of the app, I didn’t use it as much as I thought I would, since you have to move the telescope using the app anyway. It was often much easier to rely on that than checking and making manual adjustments back and forth. It is worth noting that the telescope is an entirely digital device that must be powered up to use. That means the eyepiece is also electronic, so there is no real option for analog or manual adjustments, and you cannot modify the telescope to accept another camera like you could with traditional, non-electronic telescopes.

According to the company website and spec sheets, both the eVscope 2 and the eQuinox telescopes are about the same size and weight, and both have a 4.5-inch diameter mirror inside. However, the eVscope 2 does have better resolving power and a larger field of view. The company says it has an effective focal length of 450mm and a sensor capable of capturing 7.7-megapixel images with a digital magnification factor of up to 400x.

The only thing I didn’t really like is that the scope doesn’t have a “handle” on it, which made me really nervous when taking it out of the bag and mounting it on the tripod in a dark environment. It isn’t heavy or awkward, but it did feel nerve-wracking to handle such an expensive piece of technology without a real “grip” on it. The last thing I would ever want is to slip and drop the scope as I was getting it in position.

Unistellar eVscope mounted on tripod

Mobile App and Focusing

The telescope can be used to manually find and view stellar objects (if you happen to be more experienced with telescopes than I am) or you can pair it with the free app on Android and iOS devices and let it automatically seek out and find constellations, planets, and other celestial objects that are visible based on your GPS and elevation data.

The eVscope 2 can rotate a full 360 degrees on its base while tilting up and down to find and track celestial objects, and while it may be a bit jumpy when making adjustments manually, choosing from the app’s list of available objects makes locating and tracking impressively fast and smooth.

As someone who is absolutely not familiar with using a telescope, I have to admit the automation within the smartphone app was incredibly fun and easy to use. On the first use of the app, the telescope uses autonomous field detection to find its own place on the planet (and in relative space) by taking the GPS information from the connected smartphone and comparing it to what it can see in the sky. From there it was quick and clear how to use the automated tools to find a celestial object and have the scope automatically slew to its position, leaving the user to ensure the focus was set.

Unistellar Mobile App - Slewing to Polaris

Based on whatever region the user is in, the app will tell you what is available to be seen through the scope, including recommended objects, leaving the user with just a click for the scope to get moving to the object. It is at this point where things get a little tricky for rookie users like myself. When focusing on objects at such a distance, the company recommends using a device called a Bahtinov mask, which is conveniently built into the lens cover of the telescope.

Bahtinov mask on the telescope
Bahtinov mask removed

Focusing with the Bahtinov mask using the mobile app

With the mask on and the scope pointed at a bright celestial object, the mask creates a distinct pattern of diffraction spikes that you need to align into a sort of X to achieve optimal focus (the middle spike will be dead center of the X). As an amateur, this was rather frustrating to do properly, since touching the telescope to adjust the focus ring (located at the bottom of the scope) can significantly shake and move the scope’s point of view. Pretty much all of the images I shot the first night out were at least slightly out of focus, if not entirely, but I feel as though that was still mostly user error more than anything else.

blurry stars

out of focus Vega

Once focus is achieved, the rest is rather easy. Just choose the celestial object you want to view, let the telescope find and center it, and then you can either save a quick image taken in just a few seconds or enable enhanced viewing mode, where the telescope will track and take multiple exposures, up to several hours’ worth, automatically layering them in the app and providing a much brighter and sharper image to view. My evenings of testing were rather short, so I was not able to test a complete overnight shot; however, below you can view a few images that were captured using this method by the Unistellar community (shared with permission) and cleaned up in Adobe Lightroom and Photoshop.
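The stacking idea behind enhanced viewing mode can be illustrated in a few lines: averaging many aligned exposures reduces random noise roughly with the square root of the frame count. This is a generic sketch with synthetic frames, not Unistellar's actual processing pipeline:

```python
# Minimal illustration of exposure stacking: averaging N aligned noisy
# frames lowers random noise by about sqrt(N). Synthetic data only; this
# is not Unistellar's real pipeline.
import numpy as np

rng = np.random.default_rng(0)

true_image = np.full((100, 100), 50.0)  # ideal, noise-free scene
noise_sigma = 10.0

# Simulate 47 noisy exposures of the same (already aligned) scene.
frames = [true_image + rng.normal(0, noise_sigma, true_image.shape)
          for _ in range(47)]

stacked = np.mean(frames, axis=0)

print(f"Single-frame noise:  {np.std(frames[0] - true_image):.2f}")
print(f"Stacked-frame noise: {np.std(stacked - true_image):.2f}")
# Noise drops by roughly sqrt(47), i.e. close to 7x
```

The longer the telescope tracks and stacks, the more frames go into the average, which is why the multi-hour enhanced-mode images look so much cleaner than a single quick capture.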

NGC891_Galaxy_Unistellar

Running Man Nebula_Unistellar

Triangulum Galaxy_M33_Unistellar

Despite the focusing headaches and the weather not cooperating, the one thing I did find truly fun about this telescope is how you can share the experience with up to 10 people in the area using the app. As the primary user, you can share the view with multiple other smart devices and allow them to see what is happening in real-time, as well as let them take images for themselves while the primary user makes adjustments. The images captured on the smart devices are not exactly high-resolution: they are typically just a few megabytes in size and at a resolution even smaller than a typical smartphone photo (seven megapixels or smaller).

While this can be a lot of fun for everyone with you, as a photographer or someone interested in astrophotography, the image quality is pretty underwhelming.

Image Quality and the “Process”

As mentioned above, the images taken via a connected smartphone are typically a low resolution of seven megapixels or less. This is not by itself necessarily a bad thing, but I found it pretty underwhelming that a telescope at this price can’t provide any immediate method to create a high-resolution image. I say “immediate” because you can connect the eVscope to your network at home, tell the scope to upload its images to the Unistellar servers, and have the company send you higher-resolution versions.

Does it work? Yes. Is it easy to do and confirm? No, it is not. Are the images better than the tiny versions taken with a smartphone? Arguably, also no, they are not.

First off, this is one place where the application and workflow need significant improvement. With the telescope at home and on your network, you can tell it to upload the images stored on the telescope to Unistellar's servers. When you do, the app essentially shuts down and gives no indication of the upload's progress, whether it has started, or whether it has finished. All you can do is reconnect to the telescope after a few hours and check whether the internal memory is still in use. If it has been freed up, you can assume the images were uploaded and the memory is cleared for additional use.


Getting these images as RAW files is a whole additional challenge. To do so, you have to contact Unistellar customer support and provide the serial number of the telescope (as well as your email and contact info), after which a representative searches the servers to find and prepare the images for delivery back to you. This could take as little as a day or, in my case, nearly a week before the download links were provided.

Once received, I have to admit I was once again rather underwhelmed.

Granted, I am no astrophotographer (and will never claim to be), but for some reason I was expecting more out of the "RAW" stacked files than what was provided. In my short shooting test of the M57 Ring Nebula, a two-minute exposure, the mobile app somehow worked behind-the-scenes magic and created a better, more enhanced image than I could produce from the 47-image stack in Photoshop.

m57 Ring Nebula mobile phone snapshot
iPhone App Image Save
m57 Ring Nebula Photoshop RAW Stack
Adobe Photoshop Stack

After I spoke with the Unistellar team, the company said that most of its user base only uses the images saved directly to their smart devices, and that only a small percentage request access to the larger RAW files. Unistellar says it still plans to improve the interface and the backend tracking for users who want access to the RAW files and more advanced processing capabilities, but it was not able to provide any sort of timeline for that feature yet. So be aware: if you are looking to use this telescope's RAW files, you currently have to contact customer support directly each time you make an upload.

This feels like a system that is destined to fail under any kind of load. I was one of only a handful of reviewers who even have access to the eVscope right now, and I still had to wait a week to see my images. I can only imagine how long that wait could stretch once the product gets into more people's hands.


Really Fun, But Not for Photographers

While I honestly had an absolute blast with my friends each time I had the opportunity to take the eVscope out to test, as an imaging professional I was more than underwhelmed, especially given the $4,200 asking price. When I spoke with the team at Unistellar, the company confirmed that it had no intention of competing with professional top-tier astrophotography telescopes; its goal is to get the device into the hands of hobbyists and enthusiasts to share the experience of live celestial observation.

As a photographer, I was not overly impressed with the images I was able to get with the eVscope 2 telescope, but that does not mean I did not have a ton of fun using it. Sending this device back was actually a rather sad moment, and I wish I had been able to use it more, ideally somewhere with a darker night sky.

Still, you have to look at this device and understand that it is basically a big camera, made in collaboration with Nikon, a camera company. Requiring users to go through customer support to access RAW files on a case-by-case basis, instead of offering a way to simply connect the telescope to a computer for fast image downloads, is honestly a glaring missed opportunity.

The eVscope is not meant to be a professional-level photography telescope, I get that, but it could be something incredibly fun and far more useful for professionals and hobbyists alike if that feature were an option. Additionally, as a rookie telescope user, I found focusing the telescope very slow and frustrating. Given how "smart" the device is, I was left wondering why it could not focus on its own. The Unistellar team says it is researching and developing an autofocus mode, but for now, the fastest and easiest way is to use the provided Bahtinov mask.

Are There Alternatives?

The STELLINA by Vaonis is one of the more recent smart telescopes and is pretty close to the eVscope in its target market, similarly delivering images to a smartphone. It costs about $100 to $200 less but weighs significantly more.

Otherwise, you might want to look into a standard telescope with a camera mount adapter. This is significantly less “smart,” but with a bit of practice, at least you have near-immediate access to higher resolution photos.

Should You Buy It?

Probably not. If you are even a beginner-level photographer who is looking for a telescope to help you get bigger and better night sky photographs or are just looking for a small “beginner” telescope to get started with astrophotography in your backyard, this is probably not the device for you. Higher skill level photographers will be even less impressed. While you can get some incredible images of the night sky quickly and easily, the resolution and quality are not there yet.

A big leap in image quality

The Pixel 6 Pro is the 2021 flagship in Google's Pixel line of smartphones, featuring a 6.7-inch OLED LTPO display with a 120Hz refresh rate and QHD+ resolution, Google's brand new in-house-developed Tensor chipset, and up to 512GB of storage.

It is also the first Pixel phone to feature a triple camera, with ultra-wide and tele modules accompanying the primary shooter. At 1/1.31″, the primary camera's image sensor is almost twice the size of its equivalent in the Pixel 5 generation and offers a 50 MP resolution that is downsampled to 12.5 MP for the final output image. Landscape photographers and other wide-angle shooters can rely on an ultra-wide module with a 16mm-equivalent field of view, and a 4x tele lens lets you zoom in on subjects from a distance.
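Google has not published exactly how the 50 MP readout becomes a 12.5 MP image, but the usual approach is 2×2 pixel binning: each output pixel averages a 2×2 block of sensor pixels, halving each dimension and averaging down noise. A minimal sketch (illustrative, not the Pixel's actual pipeline):

```python
import numpy as np

def bin_2x2(raw):
    """Downsample a sensor readout by averaging each 2x2 pixel block.

    Assumes even height and width. A 50 MP readout binned this way
    yields a 12.5 MP output with reduced per-pixel noise.
    """
    h, w = raw.shape
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# A tiny 4x4 "sensor": each output pixel is the mean of a 2x2 block.
sensor = np.arange(16, dtype=np.float64).reshape(4, 4)
binned = bin_2x2(sensor)
# binned -> [[2.5, 4.5], [10.5, 12.5]]
```

Trading resolution for cleaner pixels this way is why large high-resolution sensors can still do well in low light.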

Let’s see how the Google Pixel 6 Pro performed in the DXOMARK Camera test.

Key camera specifications:

  • Primary: 50 MP 1/1.31″ sensor, 1.2µm pixels, 24 mm equivalent f/1.85-aperture lens, OIS, Dual PDAF
  • Ultra-wide: 12.5 MP 1/2.86″ sensor, 1.25μm pixels, 16mm equivalent f/2.2-aperture lens
  • Tele: 48 MP 1/2.0″ sensor, 0.80μm pixels, 102.6mm equivalent (4x) f/3.5-aperture lens, OIS, PDAF
  • LDAF (laser detection autofocus) sensor
  • LED Flash
  • 4K at 30/60fps (4K/30fps tested)

About DXOMARK Camera tests: For scoring and analysis in our smartphone camera reviews, DXOMARK engineers capture and evaluate over 3000 test images and more than 2.5 hours of video both in controlled lab environments and in natural indoor and outdoor scenes, using the camera’s default settings. This article is designed to highlight the most important results of our testing. For more information about the DXOMARK Camera test protocol, click here. More details on how we score smartphone cameras are available here.

Test summary

Google Pixel 6 Pro: DXOMARK Camera score 135

Pros

  • Good detail in bright light and indoor images, as well as in video
  • Good shadow detail and contrast
  • Nice and accurate color in photo and video
  • Fast and accurate autofocus in bright light and indoor conditions
  • Excellent detail in long range tele shots
  • Effective video stabilization
  • Good exposure and wide dynamic range in video

Cons

  • Narrow depth of field results in blurry background subjects in group shots
  • Noise in indoor and low-light images
  • Depth estimation errors and instabilities in bokeh shots
  • Bokeh blur effect not visible in preview
  • Ultra-wide camera not as wide as competition
  • Color instabilities and noise in video
  • Occasionally unstable video autofocus in low light

With a DXOMARK Camera overall score of 135, the Google Pixel 6 Pro puts Google back into the group of manufacturers battling it out for the smartphone camera crown. From an imaging point of view at least, that makes the device the best option for Android users in the US market, surpassing the competition from Samsung and Asus.

Overall the new Google phone delivers an outstanding Photo performance and great Video quality. Thanks to the introduction of the new 4x tele lens it also does very well in the Zoom category. Compared to one of its main rivals, the Apple iPhone 13 Pro Max, Photo and Zoom are on par but with different characteristics. The Pixel is ahead in terms of exposure and texture. The iPhone does better for some other categories, such as preview or autofocus. For Video the Pixel 6 Pro still slightly lags behind the iPhone, especially in terms of capturing high-contrast scenes, something that the iPhone 13 Pro Max excels at.

The Google Pixel 6 Pro produces excellent image quality in most situations.

The excellent Photo score of 143 is based on consistently good performance across all still-image test categories. On the Pixel 6 Pro, Google paid particular attention to contrast as well as portrait quality and skin tones, and overall the camera is very reliable and a big step forward from previous Pixel models, which had to rely on smaller sensors and less processing power. Still images show good exposure, with only slight underexposure in difficult backlit scenes. The camera also delivers a very high level of detail in outdoor and indoor conditions, but noise is often visible when shooting under indoor lighting or in low light. The autofocus works reliably in most conditions, too, but the large sensor of the primary module means that depth of field is quite limited, blurring subjects located behind the focal plane.

In terms of the Zoom score the Pixel 6 Pro mainly benefits from the 4x tele camera that delivers great results at long range, although the level of detail suffers at shorter zoom settings. Compared to the best in class the ultra-wide camera comes with some limitations, however. At 16mm-equivalent, the field of view is not very wide, the level of detail could be better and our engineers also observed some noise and exposure instabilities. On the plus side, anamorphosis is well corrected.

The Video score of 115 puts the Pixel 6 Pro into the top ten for this category as well. Video clips benefit from a wide dynamic range and good exposure in most conditions. Colors are nice in most situations and the level of detail is high, especially when shooting in bright light. The autofocus works mostly accurately, except in low light where failures can occur. The Pixel’s stabilization system does a good job at keeping things steady and even when running or walking while recording only some slight shake is noticeable. On the downside, noise as well as exposure and white balance instabilities can be noticeable in some situations.

Photo

The Google Pixel 6 Pro achieves a Photo score of 143. In this section, we take a closer look at each sub-attribute and compare image quality against competitors.

Exposure and Contrast

Best: Huawei P50 Pro (111)

In these tests we analyze target exposure, contrast, and dynamic range, including repeatability across a series of images. Tests are undertaken in a wide range of light conditions, including backlit scenes and low light down to 1 lux. The score is derived from a number of objective measurements in the lab and perceptual analysis of real-life images.

These samples show the Google Pixel 6 Pro’s exposure performance in a backlit scene.

Google Pixel 6 Pro, accurate target exposure, wide dynamic range with slight highlight clipping

Apple iPhone 13 Pro Max, accurate target exposure, limited dynamic range with stronger highlight clipping

Huawei P50 Pro, slight underexposure on left model, wide dynamic range

This graph shows the Google Pixel 6 Pro’s average contrast entropy in lab conditions.

Average entropy comparison: the Pixel 6 Pro shows an extended dynamic range when compared to its competitors, especially in high dynamic range scenes (EV2 and EV4).

Color

In these tests we analyze color rendering, skin tones, white balance, and color shading, including repeatability across a series of images. The score is derived from a number of objective measurements in the lab and perceptual analysis of real-life images.

Note that even though the Google Pixel 6 Pro does an excellent job color-wise in most conditions, a slight cast can be visible on occasion. These samples show the Google Pixel 6 Pro’s color performance in outdoor conditions.

Google Pixel 6 Pro, accurate white balance, vivid colors

Apple iPhone 13 Pro Max, slightly blue white balance

Huawei P50 Pro, accurate white balance

These samples show the Google Pixel 6 Pro’s color performance in indoor conditions.

Google Pixel 6 Pro, accurate white balance, pleasant skin tones and color rendering

Apple iPhone 13 Pro Max, warm white balance, pleasant skin tones and color rendering

Huawei P50 Pro, warm white balance, pleasant skin tones and color rendering

These samples show the Google Pixel 6 Pro’s color performance in low light.

Google Pixel 6 Pro, accurate white balance and color rendering

Apple iPhone 13 Pro Max, accurate white balance and color rendering

Huawei P50 Pro, accurate color rendering, slightly orange white balance

Autofocus

Best: Asus Smartphone for Snapdragon Insiders (109)

In these tests we analyze autofocus accuracy and shooting time, including repeatability, in the lab. We test focus failures, depth of field, and tracking of moving subjects using perceptual analysis of real-life images.

This graph shows the Google Pixel 6 Pro’s autofocus performance in the lab in outdoor high dynamic range conditions (1000 lux, 4EV, handheld).

The Google Pixel 6 Pro autofocus is fast and accurate even in challenging high-contrast scenes.

These samples show the Google Pixel 6 Pro’s autofocus and depth of field in outdoor conditions. The lack of a variable aperture (as you would have on most dedicated cameras) makes depth of field in this kind of scene a challenge for almost all current ultra-premium devices.

Google Pixel 6 Pro, depth of field

Google Pixel 6 Pro, crop: limited depth of field, good focus on foreground, middle and background out of focus

Apple iPhone 13 Pro Max, depth of field

Apple iPhone 13 Pro Max, crop: limited depth of field, good focus on foreground, middle and background out of focus

Huawei P50 Pro, depth of field

Huawei P50 Pro, crop: extended depth of field, focus on foreground and middle plane

Texture

Best: Xiaomi Mi 11 (111)

In these tests we analyze texture on faces and objects, including objects in motion, in a range of light conditions, using several lab test setups and perceptual analysis of real-life images.

These samples show the Google Pixel 6 Pro’s texture performance in daylight conditions.

Google Pixel 6 Pro, outdoor texture

Google Pixel 6 Pro, crop: excellent detail, fine detail preserved

Apple iPhone 13 Pro Max, outdoor texture

Apple iPhone 13 Pro Max, crop: good detail on face, slight loss of fine detail

Huawei P50 Pro, outdoor texture

Huawei P50 Pro, crop: good detail on face, slightly unnatural rendering

The fine detail preservation in the real-life scene above can also be seen and measured in the lab. This graph shows the Google Pixel 6 Pro's texture performance compared to the competition in the lab across varying light levels. We measure texture on various crops of our perceptual test scenes. The Pixel 6 Pro's tripod-mounted results are very high, but they drop when shooting handheld, which is the more relevant scenario for most users.

Texture comparison (20 lux, A illuminant, tripod): high levels of detail measured in most lab-tested conditions, especially when the device is mounted on a tripod

This sample shows the Google Pixel 6 Pro’s texture performance in low light in the lab.

Google Pixel 6 Pro, detail at 20 lux, phone mounted on tripod

Google Pixel 6 Pro, crop: high level of detail in low light

Apple iPhone 13 Pro, detail at 20 lux, phone mounted on tripod

Apple iPhone 13 Pro, crop: low level of detail

Huawei P50 Pro, detail at 20 lux, phone mounted on tripod

Huawei P50 Pro, crop: good detail, but not as good as the Pixel 6 Pro

Noise

Best: Huawei P50 Pro (99)

In these tests we analyze noise on faces and objects, including objects in motion, in a range of light conditions, using several lab test setups and perceptual analysis of real-life images.

This graph shows the Google Pixel 6 Pro’s noise performance in the lab across light levels.

Noise comparison (smaller value is better): some noise is measured for all lighting conditions on the Pixel 6 Pro but noise levels are lower than on the iPhone 13 Pro Max for indoor conditions and low light. The P50 Pro has lower noise in such conditions.

These samples show the Google Pixel 6 Pro’s noise performance in indoor conditions.

Google Pixel 6 Pro, indoor noise

Google Pixel 6 Pro, crop: slight noise

Apple iPhone 13 Pro Max, indoor noise

Apple iPhone 13 Pro Max, crop: some noise

Huawei P50 Pro, indoor noise

Huawei P50 Pro, crop: well-controlled noise

Bokeh

Best: Huawei P50 Pro (80)

For these tests we switch to the camera’s bokeh or portrait mode and analyze depth estimation, bokeh shape, blur gradient, and repeatability, as well as all other general image quality attributes mentioned above. The score is derived from perceptual analysis of real-life images.

On the Google Pixel 6 Pro’s bokeh shots, the level of detail in sharp areas is very high and higher than on the comparison devices. This is especially nice for portrait shots where the sharpness of the subject contrasts nicely with the background blur that has a nice shape to it. These samples show the Google Pixel 6 Pro’s bokeh simulation in daylight.

Google Pixel 6 Pro, bokeh mode

Google Pixel 6 Pro, crop: depth artifacts on subject

Apple iPhone 13 Pro Max, bokeh mode

Apple iPhone 13 Pro Max, crop: slight depth artifacts

Huawei P50 Pro, bokeh mode

Huawei P50 Pro, crop: natural subject segmentation, almost no depth artifacts

Night

Best: Huawei Mate 40 Pro+ (82)

In these tests we shoot a selection of images in pitch-black darkness as well as with city lights in the background providing some illumination. We shoot sample images with the camera at default settings in both flash-auto and flash-off modes. We analyze all image quality attributes but we pay particular attention to exposure, autofocus, and color. We do not test night modes that have to be activated manually.

These samples show the Google Pixel 6 Pro’s night performance in flash-on mode. Please note that the Pixel 6 Pro does not offer a flash-auto mode, leaving the decision whether to use flash or not to the users.

Google Pixel 6 Pro, flash-on: accurate target exposure, good detail on face, strong underexposure on background, low noise, accurate white balance

Apple iPhone 13 Pro Max, flash-auto: accurate target exposure on face, slight color cast, accurate target exposure on background, shadow clipping, very strong loss of detail, noise

Huawei P50 Pro, flash-auto: slight underexposure on face, accurate target exposure on background, very low level of detail

Artifacts

In these tests we check images for optical artifacts such as vignetting, flare, lens softness in the corners, distortion, and chromatic aberrations, as well as for processing artifacts such as ghosting and fusion errors, hue shift, and ringing.

This image shows an example of color fringing on the Google Pixel 6 Pro. Overall, most common artifacts are well corrected but color fringing and aliasing are often visible. Please also note the cold white balance cast in this image.

Google Pixel 6 Pro, artifacts

Google Pixel 6 Pro, crop: color fringing

Apple iPhone 13 Pro Max, artifacts

Apple iPhone 13 Pro Max, crop: slight color fringing

Huawei P50 Pro, artifacts

Huawei P50 Pro, crop: no fringing

Preview

Best: Apple iPhone 13 Pro Max (80)

In these tests we analyze the image quality of the preview image and the differences between preview images and captured images, particularly in terms of exposure, dynamic range, and bokeh effect. We also check the smoothness of the field-of-view changes in the preview image when zooming with both buttons or when using the pinch-zoom gesture.

These images show the Google Pixel 6 Pro’s preview performance in HDR conditions.

Google Pixel 6 Pro, preview, difference to capture in dynamic range, more highlight clipping

Apple iPhone 13 Pro Max, preview, difference to capture in dynamic range, more highlight clipping

Huawei P50 Pro, preview, strong difference to capture in dynamic range, much stronger highlight clipping

Google Pixel 6 Pro, capture

Apple iPhone 13 Pro Max, capture

These images show the Google Pixel 6 Pro's preview performance in bokeh mode.

Google Pixel 6 Pro, preview, no blur effect on bokeh mode preview

Google Pixel 6 Pro, capture

Zoom

The Google Pixel 6 Pro achieves a Zoom score of 71. The Zoom score includes the tele and wide sub-scores. In this section, we take a closer look at how these sub-scores were achieved and compare zoom image quality against the competitors.

Wide

Best: Huawei P50 Pro (57)

In these tests we analyze the performance of the ultra-wide camera at several focal lengths from 12 to 20 mm. We look at all image quality attributes, but we pay particular attention to such artifacts as chromatic aberrations, lens softness, and distortion.

This sample shows the performance of the Google Pixel 6 Pro’s ultra-wide camera in outdoor conditions.

Google Pixel 6 Pro, ultra-wide: limited field of view, slight noise, slight loss of detail, accurate white balance

Apple iPhone 13 Pro Max, ultra-wide: wide field of view, slight noise, slight loss of detail, slight white balance cast

Huawei P50 Pro, ultra-wide: wide field of view, well-controlled noise, slight loss of detail, slight white balance cast

Tele

Best: Huawei P50 Pro (140)

In these tests we analyze all image quality attributes at focal lengths from approximately 40 to 300 mm, paying particular attention to texture and detail. The score is derived from a number of objective measurements in the lab and perceptual analysis of real-life images.

These samples show the Google Pixel 6 Pro’s performance at a close range tele setting.

Google Pixel 6 Pro, close range tele

Google Pixel 6 Pro, crop: slight loss of detail, well-controlled noise, slight cast

Apple iPhone 13 Pro Max, close range tele

Apple iPhone 13 Pro Max, crop: loss of detail, noise, slightly inaccurate color

Huawei P50 Pro, close range tele

Huawei P50 Pro, crop: good detail, low noise, nice color

These samples show the Google Pixel 6 Pro’s performance at a long-range tele setting.

Google Pixel 6 Pro, long range tele

Google Pixel 6 Pro, crop: good detail, slight noise

Apple iPhone 13 Pro Max, long range tele

Apple iPhone 13 Pro Max, crop: loss of detail, slight noise

Huawei P50 Pro, long range tele

Huawei P50 Pro, crop: good detail, low noise

This graph shows the Google Pixel 6 Pro’s texture performance at a long-range tele setting.

Average texture at long-range tele: the Pixel 6 Pro's performance puts it between the Huawei P50 Pro and Apple iPhone 13 Pro Max.

Video

In our Video tests we analyze the same image quality attributes as for still images, such as exposure, color, texture or noise, but we also include such temporal aspects as speed, and smoothness and stability of exposure, white balance and autofocus transitions.

NOTE: The sample video clips in this section are best viewed at 4K resolution. 

The Google Pixel 6 Pro achieves a Video score of 115. A device’s overall Video score is derived from its performance and results across a range of attributes in the same way as the Photo score. In this section we take a closer look at these sub-scores and compare video image quality against competitors.

Exposure and Contrast

Best: Apple iPhone 13 Pro Max (118)

These sample clips show the Google Pixel 6 Pro’s video exposure performance in low light conditions.

Google Pixel 6 Pro, accurate target exposure, wide dynamic range with slight clipping, slight exposure adaptation

Apple iPhone 13 Pro Max, accurate target exposure, wide dynamic range with slight clipping, no exposure adaptation (best played on HDR display)

Huawei P50 Pro, low target exposure, limited dynamic range with shadow and highlight clipping

Color

Best: Xiaomi Mi 11 Ultra (105)

These sample clips show the Google Pixel 6 Pro’s video color in a low light scene. In outdoor and indoor conditions white balance and color rendering are mostly accurate but in this low light clip white balance instabilities can be seen.

Google Pixel 6 Pro, visible white balance adaptation and instabilities in low light

Apple iPhone 13 Pro Max, visible white balance adaptation in low light (best played on HDR display)

Huawei P50 Pro, slightly visible white balance adaptation in low light

Autofocus

Best: Apple iPhone 13 Pro Max (109)

These sample clips show the Google Pixel 6 Pro’s video autofocus performance in low light. In outdoor and indoor conditions focus is mostly fast and accurate but some failures can be observed in low light.

Google Pixel 6 Pro, autofocus instabilities

Apple iPhone 13 Pro Max, stable autofocus except some very slight refocusing, mostly fast and accurate in low light

Huawei P50 Pro, stable autofocus, mostly fast and accurate in low light

Texture

Best: Xiaomi Mi 11 Ultra (97)

These sample clips show the Google Pixel 6 Pro’s texture performance in daylight.

Google Pixel 6 Pro, good detail, including fine detail preservation

Apple iPhone 13 Pro Max, excellent detail with fine detail well preserved even in the shadows

Huawei P50 Pro, excellent detail, with well preserved fine detail

Noise

Best: Apple iPhone 13 Pro Max (105)

This graph shows the Google Pixel 6 Pro video noise performance in the lab.

Video noise comparison: temporal luminance and chromatic noise are noticeable, especially in indoor conditions and low light where noise levels are often higher than on competitors.

Artifacts

Best: Oppo Find X2 Pro (94)

For video artifacts, we check for the same kinds of artifacts mentioned in the Photo section, along with such video-specific artifacts as frame rate variation in different light conditions, judder effect, and moving artifacts (artifacts such as aliasing, color quantization, and flare can often be more intrusive when moving than in a still image).

This video still shows an aliasing effect in low light.

Google Pixel 6 Pro, video artifacts

Google Pixel 6 Pro, crop: aliasing is often visible in all conditions

Stabilization

In these tests we analyze residual motion when handholding the camera during recording, as well as when walking and running with the camera. We also look for stabilization artifacts such as jello effect, sharpness differences between frames, and frame shift (abrupt changes of framing).

These sample clips show the Google Pixel 6 Pro’s stabilization performance in daylight.

Google Pixel 6 Pro, very effective stabilization overall, only slight residual motion when walking or running while recording

Apple iPhone 13 Pro Max, slight residual motion when walking, strong residual motion when running

Huawei P50 Pro, effective stabilization, slight residual motion when running or walking

How to Light Catalog Quality Product Shots with a Single Speedlight

In this nine-minute video from Workphlo, photographer Dustin Dolby explains how to light catalog quality reflective product photos using just a single Speedlight.

To start, Dolby suggests shooting the products separately to avoid them reflecting onto each other, making it easier to get an ultra-clean look for each piece. Dolby also suggests using an app called AutoRetouch to help make automatic clipping paths upon importing the images into Photoshop, saving a lot of time on post-production.

Dolby then shows his setup: a long strip of diffusion paper extending from the lens to the product, behind which he has placed a single Speedlight in a strip box. The light source sits at a slight angle behind the product, allowing quick, minor adjustments to get the shot just right. He says that reflective products will often show a “dark padding” that varies based on the Speedlight’s positioning.

Ideally, if the light source is positioned properly, the spread will fall on just about half of the product. Then it is just a matter of correctly placing a bounce card (reflector) on the opposite side of the product to fill in the shadows and recycle the light from the strip box. The depth of the bounce card placement will affect the padding position, so be sure to make adjustments accordingly to achieve a flattering and consistent light pattern to be used across all similar products.

Once a clean setup is achieved, he recommends photographers start rotating the products through the setup, making sure the light and padding fall in similar locations for consistency. Then, all that is left is to import the files into Photoshop to begin the retouching.

Dolby explains that using AutoRetouch can help speed things up by automatically creating clipping paths and removing backgrounds for product work, making bulk editing easier. In May, AutoRetouch added a feature that allows editors to bulk edit any product photo with artificial intelligence.

Once inside Adobe Photoshop, Dolby adds a reflection by duplicating the products, flipping the duplicates vertically, positioning the “reflection” directly below the originals, reducing its opacity significantly, and finally applying a linear gradient so that the fall-off is smooth, producing a clean, catalog-style e-commerce image. He then shows how to fix any inconsistent shadows on the reflective parts of the product by copying and pasting stretched portions of the product over the existing setup and masking them in to achieve a consistent look across the whole image. Beyond that, the amount of retouching and Photoshop work is simply up to the user (or client).
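The reflection trick Dolby describes (duplicate, flip vertically, fade out with a linear gradient) is easy to see in miniature. Here is a rough sketch in pure Python, treating the image as rows of grayscale values rather than a real Photoshop document; the function name and toy values are ours, purely for illustration:

```python
def fake_reflection(rows, max_opacity=0.4):
    """Flip the image vertically, then fade each reflected row linearly
    from max_opacity at the top of the reflection down toward zero."""
    flipped = rows[::-1]          # the vertically flipped duplicate
    n = len(flipped)
    reflection = []
    for i, row in enumerate(flipped):
        opacity = max_opacity * (1 - i / n)   # linear gradient fall-off
        reflection.append([round(v * opacity) for v in row])
    return rows + reflection      # original product with its reflection below

# Tiny 2x2 grayscale "product shot" for demonstration
product = [[200, 200], [120, 120]]
print(fake_reflection(product))
```

In a real workflow the same three steps (flip, reduce opacity, apply a linear gradient mask) are done on Photoshop layers rather than pixel lists.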

For more from Dustin Dolby, subscribe to his YouTube channel.


Image credits: Photos by Dustin Dolby and used with permission.

Best Compact Camera – Get better quality than your smartphone

Best Compact Camera - Get better quality than your smartphone

Looking for the best compact camera? Look no further, as this guide is for you.

With the best will in the world, it’s not always practical to have a large camera with you. That’s where a compact camera comes in extra handy, something that you can slip into your pocket ready to shoot at a moment’s notice.

The compact camera market is diverse, but it has undeniably changed a huge amount in the past few years. It’s no longer a case of simple point-and-shoots, as for the most part your smartphone fulfils that job. Now a compact camera has to offer something extra – that could be a larger sensor, a longer zoom, or something else entirely.

Here we’ll take a look at some of the best compact cameras you can currently buy.

Best compact camera: Sony Cyber-shot RX100 VII

Sony Cybershot RX100 VII review

The ultimate offering in portability and overall image quality has to be the Sony RX100 VII. Sony’s RX100 range introduced the one-inch sensor to the market, and it’s the camera that others tend to follow. We’re now on the seventh generation of the original camera, and the tech packed into this miniature marvel is quite something.

Not only do you get a one-inch sensor, but also a 24-200mm (equivalent) lens with an f/2.8-f/4.5 maximum aperture. High-speed shooting at up to 24fps is available – pretty incredible for a pocket camera. The autofocus system is also impressive, so you could conceivably use this camera to shoot sports and action.

Other exciting features include 4K video, inbuilt wi-fi and a tilting touchscreen. You also get a cleverly hidden electronic viewfinder which pops out from the corner of the camera.

So, what’s the drawback? Well – it’s the price. You need to pay top whack to get all of these features in such a small package, and the RX100 VII currently retails for around £1000. That’s a heck of a lot of money to spend on a compact camera, but you do get something seriously impressive for your cash. If you don’t have that kind of money available, take a look at some of the older RX100 models. The RX100 V is still a fantastic option, although its lens doesn’t have as much optical zoom.

Read our Sony Cyber-shot RX100 VII Review

Best compact camera: Panasonic Lumix TZ100

Best compact camera 2017 - TZ100

In most cases, compacts which feature a large (one-inch) sensor have a restricted zoom. However, Panasonic’s TZ100 manages to bridge the gap between premium compacts and superzooms with its 10x optical zoom.

While 10x doesn’t get near the heady heights of the 30 or 40x zooms elsewhere in this list, the 25-250mm equivalent should be more than enough for most situations. Alongside this, there’s a rich feature list which includes 4K video shooting, 10fps shooting, built-in WiFi and an electronic viewfinder. The screen is fixed, which is a shame when composing from awkward angles – but it perhaps helps to keep the overall size of the camera down.

Overall, this is a very likeable camera and it’s probably the best compromise of all the cameras here – you get a bit of everything for your cash, and the price isn’t outrageously high either. Image quality is very good, and while it’s not going to match your DSLR, the fact that you can fit it into your pocket makes it particularly appealing as a travel camera.

Read our Panasonic Lumix DMC-TZ100 review

Best compact camera: Panasonic Lumix LX15

Best compact camera 2017 - Panasonic Lumix DMC LX15

Aimed squarely at the Sony RX100 audience, the Panasonic LX15/LX10 is a small camera with a one-inch sensor and a 24-72mm equivalent focal length range. It goes one small step better than the RX100 V, offering a maximum aperture of f/1.4 at its widest angle, dropping to a still very usable f/2.8 at the far end.

Unlike the TZ100, the screen on the LX15 is hinged, meaning you can tilt it to face forward – which is useful for selfies, but also other awkwardly angled shots.

This being Panasonic, 4K video and 4K photo modes are included with the LX15 – both of which are appealing to a wide range of people. One big downside here though, especially for enthusiasts, is the lack of a viewfinder.

Another very likeable compact camera from Panasonic, which produces great images at a fraction of the price of the Sony RX100 V. Depending on what you need from a camera, this could be the better choice if you don’t want to spend too much.

Read our Panasonic Lumix DMC-LX15 review

Best compact camera: Canon G7X Mark II

Best compact camera 2017 - Canon G7X Mark II

Canon’s long established G range of premium compact cameras has a diverse line-up with something to suit most different users. The Canon Powershot G7X Mark II has a one-inch sensor and a 24-100mm equivalent zoom lens. That slight extra reach of the zoom lens when compared with the Sony RX100 V and Panasonic LX15 arguably makes it a tad more appealing, especially considering that the maximum aperture of f/1.8 still only drops to f/2.8 at this point.

The screen is tilting and touch-sensitive, but sadly there’s no inbuilt viewfinder here – something which would be extremely welcomed by enthusiast photographers. Furthermore, for those interested in capturing video, you’re limited to just Full HD with the Canon – that’s unlikely to be too much of an issue for the average shooter, but it helps to show the age of the camera.

Image quality is great, with attractive colours and a good low-light performance. If you already own a Canon DSLR, you’re likely to find sticking with the same brand very appealing.

Read our Canon PowerShot G7 X Mark II review

Best travel zoom: Panasonic Lumix TZ90

Best compact camera 2017 - TZ90

Panasonic’s latest travel zoom compact builds on the successes of all that came before it.

It features a 30x optical zoom, but in return for that large zoom range, you need to accept a smaller sensor than its one-inch comrade, the TZ100.

Probably the most fully featured superzoom compact on the market: as well as the huge zoom, you get an inbuilt viewfinder (albeit a small one), 4K video shooting, a touch-sensitive screen, manual controls, raw format shooting and a body which just about fits in your pocket.

A great choice for those looking for something to take on their travels, though in low light it suffers by comparison with its larger-sensor rivals. If you’re mainly going to be using it on sunny holidays, you shouldn’t worry too much about that.

The Panasonic Lumix TZ90 has since been succeeded by the TZ95, which adds a slightly larger EVF and Bluetooth, but the TZ90 remains excellent value for money.

Panasonic TZ90 vs Panasonic TZ100

Best Vlogging camera: Sony ZV-1

Best Vlogging Camera: Sony ZV-1

Best vlogging camera, the Sony ZV-1 with wind muff

The Sony ZV-1 has been specifically designed to be an excellent compact camera for vlogging and video recording, and thanks to a one-inch sensor, a high-quality lens, and some video-friendly features, it delivers the goods.

It features 4K video recording and specific video features to make vlogging even easier, with a “Product Showcase” mode, as well as a Background Defocus switch. You’ll find a multi-direction microphone on top, which is provided with a “deadcat” designed to reduce wind noise, and a screen that can be turned around for vlogging and selfies.

If you want to use an external microphone you can, as there’s a microphone socket on the side, as well as HDMI output.

Read our Sony ZV-1 Review

Best Retro compact camera: Fujifilm X100V

Fujifilm X100V

The Fujifilm X100V isn’t just a good-looking camera; it also takes excellent photos thanks to a 26MP APS-C X-Trans CMOS sensor, and an updated lens design improves macro performance. The lens on the front is a 23mm f/2.0 prime, giving a 35mm-equivalent field of view in full-frame terms.
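That “35mm equivalent” figure is just the lens’s focal length multiplied by the sensor’s crop factor, which for APS-C is roughly 1.5x. A minimal sketch (the helper name is ours, for illustration only):

```python
def equivalent_focal_length(focal_mm: float, crop_factor: float) -> float:
    """Full-frame (35mm) equivalent focal length for a given sensor crop."""
    return focal_mm * crop_factor

# X100V: 23mm prime on an APS-C sensor (~1.5x crop)
print(equivalent_focal_length(23, 1.5))  # 34.5, i.e. roughly a 35mm field of view
```

The same arithmetic explains the other equivalence figures in this guide, such as the one-inch-sensor compacts whose quoted zoom ranges are already converted to full-frame terms.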

You’ll also find Fujifilm’s latest Film Simulation modes, with a number of black and white film options, including ACROS, as well as the option to add a film-like grain effect to images, great for those gritty black and white street photographs.

There’s a hybrid optical/electronic viewfinder that gives you the best of both worlds, along with a true rangefinder-style shooting experience. The touchscreen on the back tilts, which can help with awkward angles, or when you want to “shoot from the hip”.

If you’re interested in recording video, then you’ll find the camera has 4K video recording at 30fps, although the lack of image stabilisation may be a deal-breaker for some.

Read our Fujifilm X100V Review

Best pocket camera: Ricoh GR III

Ricoh GR III

The Ricoh GR III, like the X100V, has an APS-C CMOS Sensor, which is impressive considering the compact size of the camera. It features an 18.3mm f/2.8 lens, equivalent to 28mm (in 35mm terms), and the camera has a clever “Snap” focus system so you can quickly get shots without any delay from focusing, making it another great street camera.

The Ricoh GR III is the digital version of the cult classic Ricoh GR film camera, and is designed to be a pocketable camera that you can take anywhere. There’s built-in sensor-shift shake reduction that moves the 24MP sensor across three axes to counter any shake.

There’s a 3-inch touchscreen on the back, and you’ll find dual command dials, making it easier to change settings quickly.

Read our Ricoh GR III Review

Best Waterproof camera: Olympus Tough TG-6

Olympus Tough TG-6

Olympus has been making tough, waterproof, compact cameras for a very long time now, and it’s culminated in the Olympus Tough TG-6, the 6th version of the premium waterproof camera.

Over the years it’s been refined, with improvements made to image quality, video recording, and strength. You’ll even find there’s a range of accessories available for this camera that can improve close up flash performance or add extra protection to the camera.

The camera uses a 12MP sensor along with an f/2.0 lens which gives it an edge over entry-level waterproof cameras, and will help with the low-light conditions you find underwater. Thanks to the folded optics used in the construction of the lens, the camera has an impressive level of macro performance letting you get detailed close-up shots.

4K video recording is included. Fans of macro photography will be impressed by the built-in focus stacking, and there are some manual controls available for when you’re shooting.

Best compact camera: Panasonic Lumix LX100 II

Panasonic Lumix LX100 II

The Panasonic Lumix LX100 II offers a multi-aspect ratio sensor, based on a Four Thirds sensor, and combined with a bright f/1.7-2.8 zoom lens with optical image stabilisation, you get a camera that can perform well in low-light shooting situations.

There’s a 3-inch touchscreen, but unfortunately this doesn’t tilt. You’ll also find a high-resolution electronic viewfinder (EVF) with 2.76m dots.

The metal bodied camera benefits from a number of external controls and switches, and this makes it a great tactile camera to use, letting you set different settings even when the camera is switched off.

As you would expect with a premium camera, you can record 4K video, and the camera has built-in Wi-Fi so you can transfer images to your smartphone, as well as control it remotely.

Read our Panasonic Lumix LX100 II Long Term Review

We hope that’s been of some help to you. Do you think we’ve left out any other top options? Please feel free to suggest them or ask any questions. Have a look at more buying guides here.

A quality showing by the mid-ranger

Good in zoom for its segment

Positioned in the Oppo Find X3 line between the flagship X3 Pro at the top and the X3 Lite at the bottom, the Oppo Find X3 Neo offers a midrange option with some high-quality features for several hundred dollars less than the X3 Pro. One key difference is that the X3 Neo runs on Qualcomm’s Snapdragon 865 chipset with 5G support instead of the faster Snapdragon 888.

The Find X3 Neo boasts a 6.55-inch, 90-Hz FHD+ AMOLED display. Let’s see how it fared in our rigorous Display testing protocol.

Key display specifications:

  • AMOLED display
  • Size: 6.55 inches. Dimensions: 159.9 x 72 x 7.99 mm (6.30 x 2.85 x 0.31 in)
  • Weight: 184g
  • Screen ratio: 89.6%
  • Resolution: 1080 x 2400 pixels
  • Aspect ratio: 20:9, ~402 ppi density
  • Refresh rate: Maximum 90 Hz

About DXOMARK Display tests: For scoring and analysis in our smartphone and other display reviews, DXOMARK engineers perform a variety of objective and perceptual tests under controlled lab and real-life conditions. This article highlights the most important results of our testing. Note that we evaluate display attributes using only the device’s built-in display hardware and its still image (gallery) and video apps at their default settings. (For in-depth information about how we evaluate smartphone and other displays, check out our articles, “How DXOMARK tests display quality” and “A closer look at DXOMARK Display testing.”)

Test summary

Oppo Find X3 Neo

Display score: 85

Pros

  • Brightness is well adapted in most conditions, especially indoors and in low light.
  • Colors are quite faithful in indoor and low-light conditions.
  • The device feels smooth when browsing the web.

Cons

  • Video contrast differs from the artistic intent of content and impedes viewing pleasure on HDR10 content.
  • Colors, especially skin tones, are altered in bright outdoor conditions.
  • When gaming, touchscreen corners and edges can be difficult to engage.

The Oppo Find X3 Neo’s overall score of 85 puts it just outside the top 20 in our database rankings, tying it with the Apple iPhone 12 and the Huawei P40 Pro. Among the devices we’ve tested in our premium price category, which includes the Apple iPhone 12, it holds up quite well. In this review, we compare the Oppo Find X3 Neo with the Oppo Reno 5 Pro+ 5G, the Samsung Galaxy S21 Ultra 5G (Exynos), and the Google Pixel 5.

Readability

Samsung Galaxy S21 Ultra 5G (Exynos)

Best: Samsung Galaxy S21 Ultra 5G (Exynos) (74)

DXOMARK uses the device’s gallery app to show static (still image) content when measuring the device’s display for brightness, contrast, gamma, and blue light impact, etc.

Readability is the key measure of a smartphone display’s basic functionality. The Oppo Find X3 Neo scores respectably here among its premium category peers, though it’s 8 points removed from the top score of 74 achieved by the Samsung Galaxy S21 Ultra 5G (Exynos) among all tested devices.

When it comes to readability when browsing the web during the day in a low-light environment, the X3 Neo is almost too bright. During the night, however, brightness is suitable, even if it is at the lower limits of acceptability.

The Find X3 Neo’s luminance is close to that of its brand sibling, the Oppo Reno Pro+ 5G, and it nearly matches the Google Pixel 5’s, but it doesn’t reach the heights of the Samsung Galaxy S21 Ultra 5G (Exynos), as you can see in the graph below:

Brightness vs Contrast comparison (30,000 lux)

The Oppo Find X3 Neo lacks brightness in bright outdoor conditions, but image adaptation to intense lighting does help improve readability. The device adapts its brightness fairly well in changing light conditions, but in shade, the image enhancement is somewhat unstable — one could say a little oversensitive to the ambient lighting.

Readability outdoors in shade, from left to right: Oppo Find X3 Neo, Oppo Reno 5 Pro+ 5G, Samsung Galaxy S21 Ultra 5G (Exynos), Google Pixel 5

Photo credit: DXOMARK; for illustration only

In direct sunlight, as in the photo illustration below, the X3 Neo clearly tops the Google Pixel 5 in its readability, but doesn’t quite handle the difficult conditions as well as the Galaxy S21 Ultra 5G.

Readability in direct sunlight, from left to right: Oppo Find X3 Neo, Oppo Reno 5 Pro+ 5G, Samsung Galaxy S21 Ultra 5G (Exynos), Google Pixel 5

Photo credit: DXOMARK; for illustration only

With the blue light filter on, the brightness of the X3 Neo is slightly lower, degrading readability.

As for brightness uniformity, the Oppo Find X3 Neo is quite homogenous.

Brightness uniformity, from left to right: Oppo Find X3 Neo, Oppo Reno 5 Pro+ 5G, Samsung Galaxy S21 Ultra 5G (Exynos), Google Pixel 5

Photo credit: DXOMARK; for illustration only.

Color

TCL 20 Pro 5G

Best: TCL 20 Pro 5G (89)

DXOMARK uses the device’s gallery app to show static (still image) content when measuring the device’s display for white point, gamut, uniformity, color fidelity, and blue light filter impact, etc. 

The score of 85 in the color attribute is quite good, placing the Find X3 Neo in the top 10 among all the devices we’ve tested. It’s worth noting that the device does not change its white point in relation to types of ambient lighting.

In indoor lighting conditions, as in the photo illustration below, the colors of the Oppo device were a bit undersaturated.

Color rendering indoors, clockwise from top left: Oppo Find X3 Neo, Oppo Reno 5 Pro+ 5G, Google Pixel 5, Samsung Galaxy S21 Ultra 5G (Exynos).

Photo credit: DXOMARK; for illustration only.

In outdoor conditions, an orange cast is visible. Skin tones can be altered in an unnatural way by the image enhancement under bright sunlight. 

Color rendering in sunlight, clockwise from top left: Oppo Find X3 Neo, Oppo Reno 5 Pro+ 5G, Google Pixel 5, Samsung Galaxy S21 Ultra 5G (Exynos)

Photo credit: DXOMARK; for illustration only.

As is typical of smartphone displays generally, the Oppo Find X3 Neo shifts color when viewed at an angle, as the scattering of dots below in the right-hand chart illustrates. The color shifts to pink first then to blue as the viewing angle increases.

Oppo Find X3 Neo, color at angle

Oppo Find X3 Neo, color at angle (zoomed in)

As for color uniformity of the display, the device is quite homogenous.

With the blue light filter activated, a noticeable shift to orange occurs.

Video

Samsung Galaxy S21 Ultra 5G (Exynos)

Best: Samsung Galaxy S21 Ultra 5G (Exynos) (90)

DXOMARK uses the device’s video (or browser) app to show dynamic content when measuring the device’s display for brightness, contrast, gamma, and color.

The Oppo Find X3 Neo’s video score is fairly low, ranking it among other tested devices such as the Samsung Galaxy A52 5G (63) and the Huawei P40 Pro (61).

The X3 Neo was not bright enough when displaying HDR10 content, which is evident in the photo illustration below. Dark details disappear, and the HDR10 rendering does not function well for some colors. An orange cast is visible on skin tones. The X3 Neo’s contrast is too high on HDR10 content, which leads to unnatural rendering of some content.

Displaying HDR10 content, clockwise from top left: Oppo Find X3 Neo, Oppo Reno 5 Pro+ 5G, Google Pixel 5, Samsung Galaxy S21 Ultra 5G (Exynos)

Photo credit: DXOMARK; for illustration only.

Motion

Huawei P40 Pro

Best: Huawei P40 Pro (87)

The Oppo Find X3 Neo performed well in the motion attribute, measuring up well against some more expensive models like the Apple iPhone 12 Pro, which scored 81. It showed a few stutters at both 30 and 60 fps. No frame drops were visible when playing games. As for motion blur, the device appeared sharp. When it came to video playback reactivity, the X3 Neo pauses before resuming play, and some jerkiness was evident on 60 fps content.

Oppo Find X3 Neo, frame drops at 30 fps

Touch

OnePlus 9 Pro

Best: OnePlus 9 Pro (83)

Oppo Find X3 Neo, touch accuracy

As for touch, despite having a capped zoom, the X3 Neo was fairly accurate and pleasant to use in the gallery app; the photo above illustrates its zoom accuracy. When gaming, however, the corners of the display were not very reactive at all.

In terms of smoothness, the X3 Neo is quite smooth when browsing and in the gallery app, but it’s not smooth when gaming.

Artifacts

Huawei P50 Pro

Best: Huawei P50 Pro (86)

The Oppo Find X3 Neo scored well in the artifacts attribute. The notch in the top left of the screen may hide some content. Flicker is well managed and no judder was visible at 30 or 60 fps, although some slight judder was visible at 24 fps.

The device was vulnerable to ghost touches. Long touches in the center of the screen were not well detected when the palm was in contact with the display edges, for example.

As for aliasing, it was visible when gaming, as you can see in the image and zoomed-in crop below:

Oppo Find X3 Neo, aliasing (illustration)

Oppo Find X3 Neo, crop: some visible aliasing

Conclusion

The Oppo Find X3 Neo puts in a solid overall display performance in this midrange tier of smartphones. With particularly strong scores in artifacts, motion, and color, the device shows that the strengths of the X3 line aren’t limited to the top-shelf X3 Pro. With a bit more brightness and a better viewer experience when watching HDR10 content, the X3 Neo would really be punching above its weight.

Speed and quality in one

Speed and quality in one

The Sony Alpha 1 is the new Sony full-frame mirrorless flagship, sitting above the Sony A9 and the Sony A7 models. Housed in a familiar-styled and still relatively compact body, the Sony A1 features a completely new 50 MP “stacked” BSI CMOS sensor capable of blackout-free 30 fps bursts (lossy compressed RAW/JPEG only) and 8K/30p video capture.

Besides the impressive sensor, it has built-in sensor stabilization, which Sony claims can reduce camera shake by up to 5.5 stops, plus there’s a multi-shot mode that outputs a 199 MP image. Other high-end features include a 9.44-million-dot OLED viewfinder with an impressively large 0.9x magnification, and a tilting 3.0”, 1.44M-dot touch-sensitive LCD. On a camera destined for press and studio use, the electronic shutter can now sync with flash, at up to 1/200s; if a higher sync speed is required, the mechanical shutter can sync at up to 1/400s.

Besides the pro-oriented stills specs, the Sony A1 also has a full complement of video capabilities, including full-width 8K/30p and full-width 4K/60p (though the latter is binned, not oversampled, which accounts for the higher frame rate). It can also shoot 4K 120p from a slightly cropped region, and Sony promises a 4.3K 16-bit RAW option over HDMI in the future.

The camera has two CFexpress (Type A)/SD UHS-II card slots. There’s also full-size HDMI and USB Type C for super-fast transfer speeds or external charging, plus there’s Wi-Fi, Bluetooth and Ethernet for additional camera control and image sharing options.

Key specifications

  • 50.1 MP full-frame stacked BSI CMOS sensor
  • 5.5-stop, 5-axis image stabilization
  • Fast hybrid AF with 759 points, Real-time Eye AF
  • Native ISO 100-32,000, with expansion to ISO 50-102,400
  • 12 fps mechanical, 30 fps with electronic shutter
  • 9.44-M dot viewfinder, 240 fps refresh
  • 3.0” touch-screen LCD, 1.44 M dots
  • 8K 30p, 4.3K 16-bit RAW (over HDMI), 10-bit 4K 120p
  • Dual band (2.4/5 GHz) Wi-Fi, Bluetooth, USB-C, 1000BASE-T Ethernet
  • Dual CFexpress (Type A)/SD slots (UHS-II compatible)

Overall performance

Click on the score chart above to open the Sony A1 product page.

The Sony A1 achieved an overall sensor score of 98, which puts it in 9th position in our overall sensor ranking (including medium format) and in 7th place for full-frame 35 mm. That places the Sony A1 fractionally behind the Sony A7R IV and Nikon Z7 II, at 99 and 100 respectively, and slightly in front of the Canon EOS R5 at 95 points; less than 0.2 EV separates them in overall sensitivity. There’s really not much between them on the face of it, but it’s always worth looking at the individual metrics and how the sensors perform throughout the sensitivity range.

The Sony A1 sensor has excellent maximum color depth and dynamic range at base, measured at 25.9 bits and 14.5 EV, respectively. While still very good, the sensor doesn’t compare quite so well in our low-light ISO (Sports) category, where it achieved a computed value of ISO 3163. Given the pixel count (and smaller pixel size), though, it is still impressive.

In-depth comparisons

As this is a camera aimed at working professionals, news agencies, and well-heeled enthusiasts, we’ve compared the Sony A1 with the 20 MP Canon EOS-1DX Mark III and the 24 MP Leica SL2-S. We’ve also chosen these because the Canon adopts a high-end but conventional CMOS sensor design, now known as a front-side illuminated (FSI) type, while the Leica also has a BSI-type sensor (though not the more advanced “stacked” type found in the Sony A1).

Portrait (color depth)

The Sony A1 has a very strong response for color sensitivity, especially at low ISOs, up to ISO 200, where it peaks at 25.9 bits at ISO 50/ISO 100 (overlaid on the graph), compared with the maximum 25.2 bits recorded by the Leica SL2-S. There’s also a slight bump of around 0.5 bit between ISO 12,800 and ISO 25,600, though otherwise there isn’t much between them, which is pretty remarkable given the difference in pixel count.

For a conventional front-side CMOS sensor, the Canon EOS 1DX Mark III performs very well at high ISOs, practically matching the A1 from ISO 3200 and up, but it doesn’t compare quite so well at lower ISOs up to ISO 200, nor between ISO 800 and ISO 1600. In fairness, though, the difference there is minimal.

Images from the Sony A1 remain within our high-quality range when set at ISO 1600 (measured ISO 1175), and thanks to a second gain in the response, they stay comfortably above the 14-bit threshold (measured at 15 bits) at ISO 25,600 (measured ISO 17,859). However, so do both the Leica and the Canon, though the latter records exactly 14 bits at ISO 25,600 (measured ISO 18,482).

Landscape (dynamic range)

At base ISO, the stacked sensor in the Sony A1 peaks at around 14.5 stops of dynamic range, which is close enough to parity with the very best results from full-frames at around the 14.7 EV mark. Even against the impressive BSI CMOS in the Leica SL2-S, the Sony A1 has around +0.5 EV more at base. While that’s a small amount, it may give the A1 a slight edge when adjusting highlights and shadows in high-contrast scenes, such as dimly lit interiors with brightly lit windows.

Although the Sony A1 beats the Leica SL2-S, the conventional FSI CMOS in the Canon EOS-1DX Mark III has the same 14.5 EV dynamic range at base, and the Canon trumps both rivals up to ISO 800. After that point, though, a gain in the response of the Sony and Leica put them both on a similar trajectory as the Canon, with the DR diminishing by around -1 EV per whole ISO step — up to ISO 102,400, anyway (although the Canon can go higher).

A closer look reveals the BSI CMOS in the Leica SL2-S performs slightly better than either the FSI Canon or the stacked Sony, and from ISO 3200 on, the Leica and the Canon have a slightly wider DR than the Sony, around +0.3-0.5 EV. Interestingly, a second gain in the response of the Sony A1 between ISO 12,800 and 25,600 increases DR to the same level as the Canon and Leica, but it falls back by the same amount at ISO 102,400.

Sports (low-light ISO)

Given the Sony A1’s much higher pixel count, it isn’t really surprising to see slightly noisier images reflected in the Sony’s Sports score. However, what is perhaps surprising is that at low ISOs, the Sony is cleaner than the Canon and only marginally noisier than the Leica. At higher ISOs, the differences between the results reduce as the ISO sensitivity increases, with the Sony A1 only slightly noisier than both at our 30 dB quality threshold. That said, the +0.15EV difference between the Leica SL2-S (3163 ISO vs 3504 ISO) is barely noticeable, and it’s negligible when compared to the Canon EOS-1DX Mark III (3163 ISO vs 3248 ISO). At least that’s the case when output is normalized to 8 MP (equivalent to a 12×8 inch print at 300 dpi); when viewed on-screen, the difference in noise levels is more apparent.
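The EV gaps quoted above are simply the base-2 logarithm of the ratio between two low-light ISO scores, since one EV (one photographic stop) corresponds to a doubling of sensitivity. A quick sketch using the review's own figures (the helper name is ours):

```python
import math

def ev_difference(iso_a: float, iso_b: float) -> float:
    """Gap between two ISO scores in EV (stops): one EV = a doubling,
    hence the base-2 log of the ratio."""
    return math.log2(iso_b / iso_a)

# Sony A1 vs Leica SL2-S low-light (Sports) scores
print(round(ev_difference(3163, 3504), 2))  # 0.15 EV
# Sony A1 vs Canon EOS-1DX Mark III: a negligible gap
print(round(ev_difference(3163, 3248), 2))  # 0.04 EV
```

The same doubling logic is why the review can call the Canon gap "negligible": it is roughly a quarter of the already barely noticeable 0.15 EV difference.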

Conclusion

After the introduction of the Sony A9, it seemed that Sony had established its flagship series; with the appearance of the Sony A1, however, that’s clearly no longer the case. Indeed, on paper the Sony A1 combines the best of the A9, A7R, and A7S series in one package, and naturally that doesn’t come cheap.

The Sony A1 has a high-resolution 50 MP sensor that produces exemplary images and is practically identical in performance to existing high-end sensors with far lower pixel counts, such as those found in the superb Leica SL2-S and in Canon’s equally superb flagship EOS-1DX Mark III. Combined with the wide range of available native-mount lenses, its sensor makes the Sony A1 a highly attractive option for professionals working in a wide range of genres and markets, and it will no doubt be on the wish lists of a tranche of enthusiasts and amateurs alike.

In this review, we have mentioned the Sony A1’s most relevant rivals from other brands. As usual, you can compare it with these and other models, and create your own comparisons and in-depth analyses, using our interactive image sensor ranking tool.


Fujifilm GFX 100s Versus Sony Alpha 1: Image Quality Comparison

For those of you who enjoy extreme levels of pixel peeping, you’re in for a treat. A recent video compares the image and video quality of the Sony Alpha 1 and the Fujifilm GFX 100s. Although the two cameras sit in slightly different categories, it’s interesting to see how the two systems compare against one another.

With its GFX series of cameras, Fujifilm has single-handedly made medium format more affordable than ever before. It wasn’t long ago that a 50 MP medium format camera would cost more than $20,000. You can now purchase a 16-bit capable 100 MP medium format camera for less than some flagship full-frame cameras, such as the Sony Alpha 1. With its price tag of $6,498, the Sony Alpha 1 costs almost $500 more than the Fujifilm GFX 100s. Of course, the two cameras sit in different categories and in many respects are aimed at different kinds of photographers. Nonetheless, if you’re looking purely at image quality, the less expensive option could be the better one.

In the video linked above, Gordon Laing compares the image and video quality from both camera systems. What’s most interesting is how closely both cameras perform against one another, with each beating the other in certain categories. 

To see how both camera systems perform, check out the full video linked above.
