
Canon reveals ‘light in the dark’ competition winner


Canon has announced the winning image from its 'Light in the Dark'-themed Redline Challenge competition, chosen from over 36,000 entries.


Canon has announced the winner and the shortlisted photographs from its inaugural Redline Challenge photography competition. Chosen from 36,195 images, Piotr Skrzypiec’s ‘Lost Highway’ took the top prize. As the winner, Piotr receives over €14,000 worth of Canon kit, including the EOS R5 and RF lenses, along with a personal photography assignment and a mentoring session with judge and Canon Ambassador Lorenz Holder.

“It’s hard to describe how it feels,” said Piotr. “It is both a big surprise and a huge achievement, especially as when I entered, I submitted three pictures and discovered that all of them had been shortlisted.”


Winner: Lost Highway by Piotr Skrzypiec – Skofljica, Slovenia

Alongside the winner, Canon also revealed the shortlisted photographers, who included four photographers from the UK and Ireland. The shortlisted photographers were selected for their ability to create technically and interpretively original photographs reimagining the Light in the Dark theme.

Shortlisted photographers:

  • Andrée Letendre
  • Curtis Walsh
  • Goran Loncar
  • Kiko Ruiz Lloret
  • Matteo Bertaggia
  • Michael Zech
  • Mika-Nikolas Mahringer
  • Morne Laubscher
  • Pierre-Emmanuel Samson
  • Sally Heaphy
  • Sylvain Cochard
  • Tadej Žlahtič
  • Tara Keane
  • Victor Madelaine
  • Yanissa Geerts

Shortlisted: Sally Heaphy – Humble, UK


Shortlisted: Goran Loncar – Dublin, Ireland


Shortlisted: Curtis Walsh – Quendon, UK

Susie Donaldson, judge and ITCG European Marketing Director for Canon EMEA, added: “As a judge of the Redline Challenge, I was blown away by how many unique interpretations of the theme there were and by the technical ability of the entrants.

“There were a number of incredible photos, but the winning image, Lost Highway, had such impact that we just kept coming back to it. It really reflected the theme and the times we are in right now, perfectly capturing the idea of emerging from something, taking a new turn into hopefully more positive times.”

Learn more about the Redline Challenge competition: Redline Challenge: Photography Competition


Further reading

Canon EOS R6 wins Camera of the Year in the Amateur Photographer Awards 2021

Canon launches compact RF 14-35mm F4L IS USM wide zoom



A Quest for the Dark Side and Better Astrophotography


One of the Holy Grail quests for astrophotographers is the search for dark skies. Few of us are fortunate enough to live in ideal dark skies, but most of us are mobile enough to get to somewhere better than the center of an urban area.

In 2001, John Bortle published an article in Sky & Telescope describing an informal scale for rating your skies, now appropriately known as the Bortle scale. On his scale, 1 is best and 9 is worst. Bortle 9 is what I live under; I don’t bother with a flashlight when I go out in my backyard at midnight.

[Images: backyard sky shot and its contrast-enhanced version]

The contrast-enhanced shot above was taken after midnight with no moon in the sky. In person, only a couple of stars were visible when I took the shot. On the processed shot, I’ve circled the three bright stars of the prominent Summer Triangle. The other bright object on the left side of the image is Jupiter.

[Image: the southwest view from the observatory]

My local solution is a 100-mile (161 km) drive to my observatory at an elevation of 4,300 feet (1,310 meters), after which I’m under skies that are perhaps Bortle 4+ on a good night. The shot above was taken towards the southwest, where the glow of San Diego dominates the center horizon and the glow of the nearby town of Temecula and more distant Los Angeles begins on the right.

[Image: cover photo, San Pedro de Atacama under a first quarter moon]

The cover photo (repeated above) was shot under first quarter moonlight at a site in San Pedro de Atacama, at an elevation of 7,900 feet (2,407 meters) in northern Chile, which would probably be classified as a very good dark site (perhaps Bortle 1+). Despite the first quarter moon, the Milky Way is clearly visible. The volcanic peak in the background is Licancabur, which sits on the border between Chile and Bolivia; the top of the cone is at 19,409 feet (5,916 meters).

[Image: southern Milky Way mosaic shot in Namibia]

Above is a mosaic shot at (by far) the darkest site I’ve ever visited (Namibia). It is situated on a high, dry plateau at the edge of the Kalahari desert with superb conditions for astrophotography, but is a real journey to reach. The image of the southern Milky Way is a 5-panel mosaic of 40-minute exposures on medium format film.

[Image: the night sky from the summit of Haleakala, Maui]

A more accessible site is Haleakala on the island of Maui. At an elevation of 10,023 feet (3,055 meters), the air is very transparent and steady, but as you can see in the image above, light pollution is clearly visible nearby. Tourism-oriented resorts and businesses outline the island shores on the left and right sides, with the central glow coming from the towns of Wailuku and Kahului, where the airport is located. On the plus side, the top of the volcano is easily accessible by ordinary vehicles on a wide, paved road and is a national park. On the minus side, the peak is now so crowded that the National Park Service requires reservations to view the sunrise.

Jumping Into the Search

To aid in finding a suitably dark site, there are now several light pollution maps available on the web, as well as recommended lists of public locations with dark skies. The IDA (International Dark-Sky Association) is also a source of information if you want recommendations for reducing the light footprint of your own property.

While perusing light pollution maps can be helpful, it’s prudent to remember that, like a map of the average cloudiness of the sky, they show averages. As with the actual weather, the conditions at a particular location depend heavily on several factors:

  • Brightness of lights in your immediate vicinity
  • Lighting technologies
  • Air pollution (atmospheric scattering)
  • Altitude

The first point is the obvious one. Nearby lights can shine directly into your lens, causing reflection artifacts, or can ruin your night vision. Seasonal effects include wind, fires, and fog. Holiday lighting is also becoming an increasing source of light pollution now that cheap strings of LED lights are widely available.

A more subtle issue is the average glow of distant towns or cities, especially annoying for landscape astrophotographers. Even for deep-sky photography, these distant light domes limit the direction and minimum altitude of shooting. Wide-angle shots are particularly affected, with distinct gradients annoyingly spanning the photo. But even these effects vary: at my observatory, at certain times of the year, low coastal fog smothers the light domes of the surrounding cities, improving the Bortle rating considerably.

Lighting Technologies

The type of lighting also plays a significant role in the severity of light pollution effects. Many older types of lighting emit in specific spectral bands, allowing at least the possibility of using filters to block some of the interference. Ironically, the ugly low-pressure sodium streetlight spectrum was, from the astronomers’ viewpoint, the easiest to block, but its monochromatic orange hue made it hard to even locate your car in a parking lot!

With the widespread availability of low-power LED lighting, many lights have been switched over to take advantage of its reliability and low cost. Unfortunately, to encourage the switch to LED lighting, manufacturers engineered bluer, more natural-looking light, and in doing so we have collectively shot ourselves in the foot. LEDs are naturally very narrow-band sources of light, but phosphors have been added to absorb and re-emit the light to cover a wider spectrum. In the process, we’ve managed to swing the light pollution spectrum towards the blue, which scatters in our atmosphere more than redder lighting, as described in a recent study.
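To put a rough number on that blue shift: in the idealized Rayleigh model, scattering strength scales as 1/λ⁴, so the roughly 450 nm blue peak typical of white LEDs scatters about three times more strongly than the 590 nm amber of low-pressure sodium. The snippet below is only a simplified sketch; real skyglow also involves aerosol (Mie) scattering and full lamp spectra, so treat the ratio as indicative rather than exact.

```python
# Rough Rayleigh-scattering comparison: blue LED emission vs. amber sodium light.
# Simplified illustration only -- real skyglow also depends on aerosols (Mie
# scattering), lamp spectra, and shielding, so treat the ratio as indicative.

def rayleigh_relative_scatter(wavelength_nm: float, reference_nm: float = 550.0) -> float:
    """Relative scattering strength vs. a reference wavelength (Rayleigh: ~1/lambda^4)."""
    return (reference_nm / wavelength_nm) ** 4

blue_led = rayleigh_relative_scatter(450.0)   # typical blue peak of white LEDs
amber_na = rayleigh_relative_scatter(590.0)   # low-pressure sodium emission line

print(f"Blue 450 nm scatters ~{blue_led / amber_na:.1f}x more than amber 590 nm")
# -> roughly 3x under this simplified model
```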

Air Pollution (Atmospheric Scattering)

The blue light scattering problem also highlights the role played by particles in the air (whether considered pollution or not). The light sources, by themselves, would not be such a problem if the light did not have a way to scatter and bounce back at us. Smoke and urban smog are the most obvious contributors, but moisture and wind-whipped dust can also subtly reduce contrast in our images, even though distinct layers of haze may not be obvious to the eye.

In the daytime, you can get an idea of how much of a problem scattering is for you by blocking the sun and seeing how blue the sky looks as you get close to the sun. Ideally, the sky will look dark blue right up to the edge of the sun. At night you can do the same test with the moon. At my observatory site, the sky can often look clear to the eye, but as soon as something bright like Venus or Jupiter rises, it becomes readily apparent that sky haze is present. In long exposures, large haloes (not related to chromatic aberration) become visible.

Altitude

One way around this problem is to go higher in altitude, above the low-lying air pollution as well as the clouds. With thin and clear enough air, it’s possible to photograph the Milky Way even when the Moon (the worst natural light pollution source) is out. But even this may not be a good fix if a global event such as a large volcanic eruption has put ash high into the atmosphere. Your personal sensitivity to high altitudes may also limit this option.

[Image: the Simons Observatory, northern Chile]

The Simons Observatory (above) in northern Chile sits at 17,000 feet (5,182 meters), with air clear enough to see the Milky Way even with a first quarter moon in the sky.

Aircraft

Another often-ignored source of light pollution is aircraft traffic. Aircraft are a double hit: light pollution as well as air pollution. Aircraft traffic exists at all hours of the day and night, and at night they fly with bright navigation lights. A good strategy is to check not only light pollution maps but also aircraft flight path maps, and choose a location accordingly.


In addition to the navigation lights of aircraft, engine exhaust is emitted at altitudes high enough for it to linger for long periods, often in the form of visible contrails (high-altitude ice crystals).

What about satellites? For amateur astronomers, they are not a problem. They are much dimmer than aircraft, have no navigation beacons, and the low-flying ones are mostly visible near sunset or sunrise. For professional astronomers, they could become a problem, but amateur astronomers have larger problems to contend with.

Other Astronomers and Astrophotographers

And finally, I have to say that sometimes we are our own worst enemies. When we’re out shooting our own astrophotos, we need to keep in mind that 50 meters away, another astrophotographer may be trying to do their own thing. Lighting up the landscape with your flashlight may interfere with the next person’s shot. Your cellphone or camera rear screen can be just as bad. Even the self-timer countdown flasher or memory card write light could be a problem, so have some black tape handy to suppress these sources of light.

[Image: a green laser pointer caught in a long exposure]

As an astrophotographer, it’s also a good idea to avoid groups of amateur astronomers doing visual astronomy. They often include people with flashlights pointing everywhere, including directly at your camera. Green laser pointers are often a problem too and are bright enough to be picked up in photos (look closely at the image above), even when moved around. For this reason (as well as for eye safety), I strongly discourage their use as polar “finders” or pointers to targets.

Have you got a good location to recommend? Please add your comments below!



Scientists Photographed Our ‘Galactic Bulge’ Using a Dark Energy Camera



In an effort to research how the center of the Milky Way Galaxy formed what is known as a “galactic bulge,” scientists used the Dark Energy Camera to survey a portion of the sky and capture a photo of billions of stars.

NASA’s Hubblesite describes our galaxy as “shaped like two fried eggs glued back-to-back.” This depiction makes clear the central bulge of stars that sits in the middle of a sprawling disk of stars that we usually see in two-dimensional drawings. You can get a better idea of how that looks thanks to a rendering from the ESA below:

[Image: ESA rendering of the Milky Way and its central bulge]

This makeup is thought to be a common feature among myriad spiral galaxies like the Milky Way, and scientists desired to study how the bulge was formed. Were the stars within the bulge born early in our galaxy’s history, 10 to 12 billion years ago, or did the bulge build up over time through multiple episodes of star formation?

“Many other spiral galaxies look like the Milky Way and have similar bulges, so if we can understand how the Milky Way formed its bulge then we’ll have a good idea for how the other galaxies did too,” said co-principal investigator Christian Johnson of the Space Telescope Science Institute in Baltimore, Maryland.

The team surveyed a portion of our sky covering more than 200 square degrees – an area approximately equivalent to 1,000 full Moons – using the Dark Energy Camera (DECam) on the Victor M. Blanco 4-meter Telescope at the Cerro Tololo Inter-American Observatory in Chile, a Program of NSF’s NOIRLab.
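That comparison is easy to sanity-check: the full Moon spans roughly half a degree on the sky, so it covers about 0.2 square degrees, and 200 square degrees works out to on the order of 1,000 Moon-sized patches. A quick check, with the Moon’s approximate 0.52° average angular diameter as the only assumed figure:

```python
import math

# Sanity check of the "more than 200 square degrees ~ 1,000 full Moons" comparison.
# The Moon's average angular diameter is roughly 0.52 degrees.
moon_diameter_deg = 0.52
moon_area_sq_deg = math.pi * (moon_diameter_deg / 2) ** 2   # ~0.21 sq. deg.

survey_area_sq_deg = 200
print(f"Full-Moon area: ~{moon_area_sq_deg:.2f} sq. deg.")
print(f"Survey area in Moons: ~{survey_area_sq_deg / moon_area_sq_deg:.0f}")
# -> about 940 Moons, consistent with the quoted "approximately 1,000"
```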

This image shows a wide-field view of the center of the Milky Way with a pull-out image taken by the DECam.

The scientific sensor array on the DECam is made up of 62 separate 2048×4096 pixel backside-illuminated CCD sensors, totaling 520 megapixels. An additional 12 2048×2048 pixel CCD sensors (50 megapixels) are used to guide the telescope, monitor focus, and help with alignment.
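Those megapixel totals follow directly from the quoted sensor counts; a quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of DECam's quoted pixel counts.
science_ccds = 62 * 2048 * 4096   # imaging array
aux_ccds = 12 * 2048 * 2048       # guiding / focus / alignment sensors

print(f"Science array: {science_ccds / 1e6:.0f} MP")   # ~520 MP
print(f"Auxiliary CCDs: {aux_ccds / 1e6:.0f} MP")      # ~50 MP
```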

This wide-field camera is capable of capturing 3 square degrees of sky in a single exposure and allowed the team to collect more than 450,000 individual photographs. From that data the team was able to determine the chemical compositions for millions of stars. The image below contains billions of stars:

[Image: DECam image of the galactic bulge region, containing billions of stars]

You can view a pannable and zoomable version of this image here. It uses the same interface as the giant 2.5 gigapixel image of the Orion Constellation taken by Matt Harbison.

For this particular study, scientists looked at a subsample of 70,000 stars from the above image. It had previously been believed that the stars in the bulge were born in two separate “waves” early in the history of the galaxy, but thanks to data gleaned from this study, scientists now think that the vast majority were formed at about the same time, nearly 10 billion years ago.

According to NASA, the researchers are looking into the possibility of measuring stellar distances to make a more accurate 3D map of the bulge. They also plan to look for correlations between their metallicity measurements and stellar orbits. That investigation could locate “flocks” of stars with similar orbits, which could be the remains of disrupted dwarf galaxies, or identify signs of accretion, such as stars orbiting opposite the galaxy’s rotation.

(Via Hubblesite and SyFy)



Behind the Scenes of Apple’s “Dark Universe” Video Shot on iPhone 12 Pro


In a video commissioned by Apple, Donghoon and James of Incite Design show off some incredible visuals captured by the company’s latest smartphone, the iPhone 12 Pro. The two show how they did it in this 5-minute behind-the-scenes explanation.

“We were enamored with the idea of trying to create a fictional universe,” Donghoon said. “In the past, we weren’t able to get the look that we wanted. I really had to fight the darkness.”

Donghoon continued, “We had to contend with, ‘How do you film darkness?’ What are the parts that build up this fictional universe?”

“With the iPhone 12 Pro, we were able to shoot so much better in low light,” James said.

The duo used a combination of plasma, different colored lasers, as well as different materials to produce a mix of visual effects. The behind-the-scenes video also shows how the two used different liquids to produce a flowing “clouds” effect.

The different ideas, machines, and techniques that Donghoon and James talk about in this video come at a rapid-fire pace, and perhaps more impressive than the final visuals are the interesting ways the two created the looks they wanted. In one clip they show how they dropped the phone directly into rocks that they had fired upwards using a piston; in another, they use magnets and iron filings with the camera very close to the surface. Going one step further, they use ferrofluid to create ripples that flow wildly on camera.

The final video is definitely worth your time to watch after seeing the methods the two used to create it:

If you have ever been curious about how creators make some of the most interesting practical effects, even this short video will hit you with a large number of outstanding ideas that are worth considering. Though Apple warns “do not attempt” in the behind-the-scenes look, it’s likely more to prevent someone from damaging their phone than it is a warning about experimenting with different materials and lighting.

If you’re interested in what Incite Design has done here, you can follow them on Instagram for more examples of their outstanding effects work.

(Via ISO 1200)



Sony’s long-awaited A7S III is built for shooting high-res video in the dark



Sony’s A7S III got some slight design tweaks from its previous version. (Sony)

Since its debut back in 2014, Sony has built its A7S line of cameras specifically for absurd low-light performance. This week, the company announced the most recent installment, the A7S III. Like its predecessors, it offers just 12 megapixels of resolution on a new custom-built sensor. And while it won’t be winning any megapixel wars—especially against cameras like Canon’s recently announced 45-megapixel EOS R5—Sony’s latest offering looks like a monster when it comes to video and shooting in the dark.

Here are some of the highlights from one of Sony’s most impressive cameras yet.

A new custom sensor

From a hardware standpoint, Sony has made some notable changes to the chip that actually handles the light collecting in the A7S III. The resolution remains at 12 megapixels, just like the previous iterations, which gives the photo sites lots of room to spread out. Bigger photo sites can collect more light before hitting their limits, which typically translates into cleaner images shot in low light. You can push the A7S III’s ISO setting all the way up to 409,600, but we’ll have to wait for production samples to see how usable the results remain before too much digital noise creeps in.
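A rough pixel-pitch comparison shows why that matters: spreading roughly 12 million pixels across a standard full-frame sensor (about 36 × 24 mm) gives each photosite nearly twice the pitch, and roughly four times the area, of a 45-megapixel chip like the EOS R5’s. The sketch below assumes square pixels with no gaps between them, so it’s an approximation rather than a spec-sheet figure.

```python
import math

# Rough pixel-pitch comparison: A7S III-class 12 MP vs. EOS R5-class 45 MP,
# both assumed to be standard full-frame sensors (~36 x 24 mm), square pixels,
# no allowance for gaps or circuitry -- an approximation, not a spec sheet.
SENSOR_AREA_UM2 = 36_000 * 24_000   # full-frame area in square micrometers

def approx_pixel_pitch_um(megapixels: float) -> float:
    """Approximate pixel pitch (micrometers) for a given resolution."""
    return math.sqrt(SENSOR_AREA_UM2 / (megapixels * 1e6))

print(f"12 MP: ~{approx_pixel_pitch_um(12):.1f} um pitch")   # ~8.5 um
print(f"45 MP: ~{approx_pixel_pitch_um(45):.1f} um pitch")   # ~4.4 um
```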

The sensor is now backside-illuminated, which is a structural change regarding the actual sensor assembly. BSI chips typically also improve low-light performance, which is why you often see them in tiny smartphone sensors that struggle with digital noise. I wouldn’t expect the switch to BSI to make an enormous difference all on its own, but if you’re building a camera to shoot in the dark, it’s a logical jump to make.

That 12-megapixel resolution has other functions beyond keeping the pixels large: it’s also specifically good for shooting video. The A7S III can shoot “native” 4K footage, which means it’s using essentially the entire sensor on a pixel-by-pixel basis to shoot 4K video. Other cameras with higher-resolution sensors typically resort to “pixel binning,” which involves grouping pixels together to act as a single pixel to make up for the resolution disparity. Other manufacturers simply crop into the sensor and only use an area in the center that’s large enough to produce a 4K image. That’s not ideal because it changes the view from your lenses and makes it difficult to capture wide-angle shots.
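To make the distinction concrete, here is a purely illustrative sketch (NumPy, made-up dimensions) contrasting the two workarounds described above: averaging 2×2 blocks of pixels versus cropping the center of the sensor. Real cameras do this in hardware, with demosaicing and scaling on top, so treat this as a conceptual sketch only.

```python
import numpy as np

# Illustrative contrast between the two workarounds described above:
# 2x2 pixel binning vs. cropping the sensor center. Dummy data and dimensions;
# real cameras implement this in hardware with demosaicing and scaling on top.
sensor = np.random.rand(4320, 7680)          # pretend 7680x4320 (8K-ish) readout

# Pixel binning: average each 2x2 block into one output pixel -> 3840x2160
binned = sensor.reshape(2160, 2, 3840, 2).mean(axis=(1, 3))

# Center crop: keep only a 3840x2160 window -> narrower field of view
top = (sensor.shape[0] - 2160) // 2
left = (sensor.shape[1] - 3840) // 2
cropped = sensor[top:top + 2160, left:left + 3840]

print(binned.shape, cropped.shape)   # both (2160, 3840), very different fields of view
```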

Sony considers this a hybrid camera and it will certainly shoot beautiful stills, but 12 megapixels for still shooting feels low here in 2020, especially considering that cameras with smaller sensors, such as the Fujifilm X-T4, offer more than double. And while resolution isn’t everything, even a 5K monitor (readily available on the market right now) checks in at nearly 15 megapixels at its native resolution, which already outpaces the A7S III’s native pixel count. The images will still look beautiful if they’re well captured, but as we move toward 8K displays, resolution matters.

Dedicated white balance sensor

The touch screen now plays a more important role in navigating the menus. (Sony)

Cameras constantly monitor all kinds of variables in a scene, one of which is color temperature. Sony equipped the A7S III with a dedicated color-temperature sensor on the outside of the body. That’s atypical for high-end modern cameras. Sony says it will help prevent odd color shifts during video shooting when something suddenly pops into the frame and changes the overall tones within the scene.

New menu system

Sony’s menu systems have received ample, well-deserved criticism through the history of the A7 line. They’re somewhat difficult to navigate and the arrangement can be downright confusing in certain cases. Now, however, Sony has revamped its menu system to make common functions easier to find in a hurry. The A7S III employs a rotating touch screen for poking through the menus. It looks promising—and a whole lot more modern—compared to the previous version.

Dual-format card slots

Professional shooters typically want two card slots in a camera. Cards fail, and having a backup can be a lifesaver. Sony put a pair of card slots in the A7S III, and each slot can accept two different kinds of memory cards: UHS-II SD and the newer CFexpress Type A. Other manufacturers, such as Canon, sometimes mix card formats in the same camera. The EOS R5, for instance, has both an SD card slot and a slot for CFexpress cards. That’s less flexible than Sony’s hybrid option.

The kind of card you actually need depends on what sort of video footage you’re hoping to capture. If you’re trying to max out resolution, bit rate, and frame rate all at the same time, you’re going to need screaming fast memory just to keep up. If you’re trying to shoot 10-bit XAVC HS at 120 fps and 280 Mbps, speed is essential. If you don’t know what any of that means, you’re probably OK dialing down the quality and sticking with typical cards, at least for the moment.
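For a rough sense of what those numbers mean in practice, 280 Mbps works out to about 35 MB per second, or a little over 2 GB per minute of footage. The quick conversion below uses a 160 GB card purely as an illustrative capacity, not a recommendation.

```python
# Rough storage math for a 280 Mbps recording mode.
# The 160 GB card capacity is just an illustrative example.
bitrate_mbps = 280
mb_per_second = bitrate_mbps / 8            # 35 MB/s
gb_per_minute = mb_per_second * 60 / 1000   # ~2.1 GB/min

card_gb = 160
print(f"~{mb_per_second:.0f} MB/s, ~{gb_per_minute:.1f} GB/min")
print(f"A {card_gb} GB card holds roughly {card_gb / gb_per_minute:.0f} minutes")
# -> about 76 minutes, before filesystem and overhead
```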

Cooling for longer recording

The Canon EOS R5 made a big splash with its 8K video recording modes. Since then, however, controversy has emerged from the camera’s tendency to overheat after a period of time. That’s due in part to oversampling its 4K footage with that big high-res sensor. The Sony avoids that issue with its 12-megapixel chip.

Cinema cameras like those used on big movie sets typically have internal cooling systems that include fans to help displace heat that comes from hardworking components inside. The A7S III doesn’t have any fans, but it does have passive cooling material to pull heat away from the critical components. That allows it to record for longer consecutive bursts without needing a break.

Sony has had some trouble with overheating in the past, especially if you’re shooting outdoors in the sun, but the company claims some considerable improvements here, which should mean more uptime.

Two card slots can both accept SD UHS-II or CFexpress cards. (Sony)

Lots of video recording options

If you’re not plugged into the latest and greatest video recording formats, the A7S III’s spec sheet may look impossible to parse. It offers many of the common high-bitrate video formats professionals want when shooting on productions. When it comes to 4K, it offers XAVC S (H.264) and XAVC HS (H.265), both at various frame rates, bit depths, bit rates, and sampling rates. There are more options, and you can dig into them on the official spec list, but suffice it to say that it’s beastly when it comes to recording modes. It can even pass 4K/60 footage at 16-bit depth to an external recorder if you really want to max things out.

How does it compare?

In terms of competition, the $3,500 price tag puts it in the same conversation as Canon’s EOS R5—but the two are really very different cameras. A better comparison lies in the Panasonic Lumix DC-S1H, which costs $500 more, but offers a 24-megapixel sensor, higher-resolution rear screen, and 6K Raw video output to a recorder.

We’re eager to see how the low-light and video performance will look once production models are available. For now, however, the $3,000 to $4,000 segment of the camera market is as exciting as it has been in a long time. It’s also vastly more interesting than cheaper segments of the market.
