Anyone still using "lossless compression" image quality with Z9?


What version are you using? I sent them an email since the online info from them was still not showing HE* as supported.
I am using Version 6.0, build 7212 (90865b3). I think the issue is that I am using it on a PC, and Nikon provided a codec to install on a Windows PC to read the HE files. I believe that Photo Mechanic is likely using that codec as well.
 
Compression algorithms can be more efficient than post-processing tasks that require exposure and white balance decisions and format conversions. Some compression algorithms have as little as 1-2% impact on performance. I'm aware we are talking cameras here, but I work with enterprise-class high-performance storage arrays with bleeding-edge solid-state hardware. The data compression and data reduction algorithms are so efficient now that I can allocate over one petabyte of capacity to clusters of database servers, yet actually use under 300 TB on the physical NVMe SSD storage arrays while maintaining top-tier performance (I have labs to test and have seen zero penalty with data compression enabled vs disabled). So these technologies have come a long way and are no longer considered a limitation. I would think the compression technologies used in image pipelines are designed in a similar manner to achieve high efficiency and performance. Just my two cents... Nikon, give us HE RAW pre-capture, please.
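For anyone who likes to see the numbers, here is a trivial back-of-the-envelope check of the reduction ratio implied by the figures above; both inputs are just the round numbers quoted in the post, not measurements of mine:

```python
# Rough data-reduction ratio implied by the storage figures quoted above.
# Both inputs are the post's round numbers, not measured values.
allocated_tb = 1024   # "over one Petabyte" of logical capacity, in TB
physical_tb = 300     # "under 300 TB" actually consumed on the NVMe arrays

print(f"Effective data reduction: about {allocated_tb / physical_tb:.1f}:1")  # ~3.4:1
```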

Yes, I have as well. The buffer performance is so much better using HE*. I did have to wait a bit with my Z50ii until LrC supported HE* for the Z50ii. I just think it will become more of a standard when respected shooters like Steve acknowledge that they are using it.
I'm not so sure. Both are done in hardware in the still-image stream pipeline, letting the live-view stream pipeline take care of the EVF or the rear display.

Producing HE* in the pre-release capture wouldn't require the post-processing, but the HE* compression algorithm (or HE - it's just a parameter) is a little more demanding than JPEG. Since the post-processing is rather trivial, it's not clear which one would consume more power.

Either way, it shouldn't be an overhead for the EXPEED 7, so I'm not sure why it's not an option. Actually, my understanding was that only the burst modes (30-120 fps) were limited to JPEG in pre-release capture, not other framerates.
 
According to their website, FastRawViewer supports HE and HE* files. I don't shoot Nikon and don't have any HE or HE* files, but if someone wants to link me to one, I can confirm it'll open it. As for the software itself, it doesn't have lots of frills, but it's highly customizable and it opens the raws (and views the raws, not embedded JPEGs) very quickly.
 
I am using Version 6.0, build 7212 (90865b3). I think the issue is that I am using it on a PC, and Nikon provided a codec to install on a Windows PC to read the HE files. I believe that Photo Mechanic is likely using that codec as well.
I am on Apple OS, and that is probably why my Photo Mechanic Version 6.0, build 7212 (90865b3) also did not work in the past with HE*; ditto with Apple Preview and Photos. Will have to check again just in case.
 
Either way, it shouldn't be an overhead for the EXPEED 7, so I'm not sure why it's not an option. Actually, my understanding was that only the burst modes (30-120 fps) were limited to JPEG in pre-release capture, not other framerates.
Pre-release capture is only available with high-speed capture rates of 15, 30, 60, and 120 fps, and it captures JPEG images only. C15 was added with firmware 5.0 on the Z9.

 
I did a quick, unscientific test with lossless and HE* modes, and the shutter rate started to decrease with lossless after about 5-8 seconds, whereas it kept going consistently with HE*. This was with a Z9 at 20 fps and a ProGrade Cobalt 325 GB CFexpress card.
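For context, a crude buffer model shows why that kind of timing is plausible. Every figure below (average NEF sizes, sustained card write speed, usable buffer size) is a placeholder I picked for illustration, not a measured or official value:

```python
# Rough back-of-the-envelope model of why lossless slows down sooner than HE*.
# All numbers below are assumptions for illustration, not measured values.

FPS = 20                       # continuous shooting rate
FRAME_MB = {"lossless": 55.0,  # assumed average 45 MP lossless NEF size (MB)
            "HE*": 22.0}       # assumed average HE* NEF size (MB)
CARD_WRITE_MBPS = 700.0        # assumed sustained card write speed (MB/s)
BUFFER_MB = 4000.0             # assumed usable in-camera buffer (MB)

for mode, size in FRAME_MB.items():
    incoming = size * FPS                  # MB/s produced by the image pipeline
    surplus = incoming - CARD_WRITE_MBPS   # MB/s accumulating in the buffer
    if surplus <= 0:
        print(f"{mode}: card keeps up ({incoming:.0f} MB/s in), buffer never fills")
    else:
        seconds = BUFFER_MB / surplus      # time until the buffer is full
        print(f"{mode}: buffer fills in about {seconds:.1f} s ({incoming:.0f} MB/s in)")
```

With these made-up numbers, lossless overruns the card and fills the buffer after several seconds, while HE* stays under the card's sustained write speed indefinitely, which matches the behaviour of the test above in spirit if not in exact figures.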
 
Compression algorithms can be more efficient than post-processing tasks that require exposure and white balance decisions and format conversions. Some compression algorithms have as little as 1-2% impact on performance. I'm aware we are talking cameras here, but I work with enterprise-class high-performance storage arrays with bleeding-edge solid-state hardware. The data compression and data reduction algorithms are so efficient now that I can allocate over one petabyte of capacity to clusters of database servers, yet actually use under 300 TB on the physical NVMe SSD storage arrays while maintaining top-tier performance (I have labs to test and have seen zero penalty with data compression enabled vs disabled). So these technologies have come a long way and are no longer considered a limitation. I would think the compression technologies used in image pipelines are designed in a similar manner to achieve high efficiency and performance. Just my two cents... Nikon, give us HE RAW pre-capture, please.
I make chips that do both. Image compression, which is completely different from data compression, requires a lot of memory accesses, both for synchronization and buffering, which is what consumes most power in a chip. Or, if we count the number of gates, they're in a completely different ballpark.

Post-processing of that nature is usually a very straightforward pipeline in hardware that only involves local operations (often mults and adds, sometimes a LUT) which can be performed on the pixels as they stream by. In comparison, compression is very efficient, too, but the paths are more complicated and involve many different operations that have different timings. You need to do the debayering, convert the spatial pixels into frequency data (wavelets), perform an analysis for the compression and sort out what has to be left out while maximizing the quality (that one is very limiting!), quantize the data, perform arithmetic encoding, and format the stream output. It's quite hard to balance the lengths of all parts of the pipeline to preserve the top performance while not using too much memory, especially at high quality compression rates (the ones used in the HE modes) because they output quite a bunch of data whose size can vary wildly from one area to the next. It's a problem common to all compression algorithms I've seen.
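To make those stages a little more concrete, here is a toy, purely illustrative sketch of the transform, quantize, and entropy-code flow run on a fake one-dimensional "scanline". It is not Nikon's algorithm (the real HE/HE* internals are not public); the Haar transform, the quantization step, and zlib as a stand-in entropy coder are all my own simplifications. It only shows how the compressed output size varies between smooth and detailed areas, which is exactly what makes balancing the pipeline hard:

```python
# Toy sketch of the lossy-compression stages described above:
# transform -> quantize -> entropy-code, run on a fake 1-D "scanline".
# Purely illustrative; the real HE/HE* internals are not public.
import random
import struct
import zlib

def haar_1d(samples):
    """One level of a Haar transform: pairwise averages and differences."""
    avg = [(a + b) / 2 for a, b in zip(samples[0::2], samples[1::2])]
    diff = [(a - b) / 2 for a, b in zip(samples[0::2], samples[1::2])]
    return avg, diff

def compress_line(samples, q_step):
    avg, diff = haar_1d(samples)
    # Quantization is where the "what to keep vs. what to drop" decision lives.
    coeffs = [round(c / q_step) for c in avg + diff]
    payload = struct.pack(f"{len(coeffs)}i", *coeffs)
    return zlib.compress(payload)  # zlib stands in for the real entropy coder

random.seed(0)
smooth_sky = [1000 + random.gauss(0, 2) for _ in range(4096)]    # flat area
busy_grass = [1000 + random.gauss(0, 200) for _ in range(4096)]  # detailed area

for name, line in (("smooth area", smooth_sky), ("detailed area", busy_grass)):
    out = compress_line(line, q_step=8)
    print(f"{name}: {len(out)} compressed bytes for the same number of pixels")
```

The detailed line compresses to many more bytes than the smooth one, even though both hold the same number of pixels; a hardware pipeline has to buffer and pace around exactly that kind of variability.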

But I don't know exactly what they're doing in their post-processing. It's true that their demosaicing - Nikon's demosaicing used for the post-processing and JPEG compression - is very good, so it must be more complex than just applying a simple interpolation.
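As a point of reference for what "simple interpolation" would look like, here is a minimal bilinear demosaic of an RGGB Bayer mosaic that just averages the nearest same-colour neighbours. This is only the naive baseline on a made-up 4x4 mosaic; Nikon's actual demosaicing is not public and is certainly more sophisticated than this:

```python
# Naive bilinear demosaic of an RGGB Bayer mosaic: each output channel is the
# average of the same-colour pixels in the 3x3 neighbourhood. Baseline only.
def bilinear_demosaic(mosaic):
    h, w = len(mosaic), len(mosaic[0])

    def colour_at(y, x):
        # Assume an RGGB pattern: R at (even, even), B at (odd, odd), G elsewhere.
        if y % 2 == 0 and x % 2 == 0:
            return "R"
        if y % 2 == 1 and x % 2 == 1:
            return "B"
        return "G"

    def avg_neighbours(y, x, want):
        # Average every pixel of the wanted colour in the 3x3 neighbourhood
        # (including the pixel itself when it already has that colour).
        vals = [mosaic[j][i]
                for j in range(max(0, y - 1), min(h, y + 2))
                for i in range(max(0, x - 1), min(w, x + 2))
                if colour_at(j, i) == want]
        return sum(vals) / len(vals)

    return [[(avg_neighbours(y, x, "R"),
              avg_neighbours(y, x, "G"),
              avg_neighbours(y, x, "B")) for x in range(w)] for y in range(h)]

# A tiny 4x4 mosaic of made-up raw sensor values.
raw = [[10, 20, 12, 22],
       [21, 30, 23, 31],
       [11, 20, 13, 21],
       [22, 32, 24, 30]]
print(bilinear_demosaic(raw)[1][1])  # interpolated (R, G, B) at one pixel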

Those operations are performed in a separate pipeline in their chip, so as you said, there's no a priori reason not to provide the choice in pre-capture. Except, as @JAJohnson clarified, pre-release capture is only available for the 30/60/120 fps modes, which can't be achieved with lossy raw compression for the reasons given above.

So it's a problem of raw performance (pun intended), not a problem of CPU/ISP overhead. We could, theoretically, get a pre-release capture mode at 20 fps if it were available for modes other than high-speed capture, but for some reason, it's not. Maybe Nikon decided the odds of getting a good pre-capture shot at that frame rate weren't worth the trouble? I don't know.
 
Well said, interesting points that make sense.
Where do you make chips? Curious; always good to meet other folks in IT. Seems there are quite a few on these photography forums. :)

 
@MartyD update: I went out and photographed some next-door Bald Eagles this evening. I used mostly HE*, with a few shots in lossless for comparison. Photo Mechanic 6 worked fine for culling. I did not have any high-ISO images, but I tested Lightroom Classic DeNoise on one HE* file anyway (not one I would usually use it on) and it worked. Apple Preview still will not display HE* thumbnails.

No perceived difference in performance at 20 fps with the shots I took with the ProGrade Digital CFexpress Type B (Cobalt) 325 GB memory card; I never hit the buffer. I had switched back to HE* near the end of the birding outing, and when I hit playback to check the ID on some birds, the camera froze up and would not turn off... pulled the battery and all returned to normal. Something that Z9, my oldest one, had not done before.

Given the ongoing Apple issue and the freeze-up with the Z9 today while using playback with HE* images, I may just stay with lossless.
 
Yes, FastRawViewer is fantastic. I too have been using it as my raw culling software for several years. It's a steal at its price point... and it happens to be on sale. https://www.fastrawviewer.com/


For those looking for a quick, fast image viewer: I moved from Photo Mechanic and purchased a program called FastRawViewer, which is cross-platform and supports the HE* format. The author is also a guru on raw image data.
 
Ever since day 1 I have used HE* and never looked back. No complaints whatsoever. For culling pics (thousands of them) I use FastStone. Then, once selected, I use NX Studio, and CNX-2 once converted to JPG. Final touch with Topaz AI (on JPG, yes). I honestly don't need more; I should improve my own technique instead of worrying about extracting the last pixel :LOL:
 
Ugh, I just discovered HE* isn’t supported by Photo Mechanic. If that’s correct, what are good options for ingesting and culling and organizing?
Sort/cull with Photo Mechanic. Convert raws to TIFF/JPG with NX Studio. And finish in Photoshop if necessary.

The beauty of NX Studio for me is that it brings in camera adjustments like saturation, sharpness, etc. with the raw file, so most of the time for sports events and the like the files can be batch-processed or just cropped to save time. I know some would say shoot "neutral" in camera, but I haven't been able to see a difference.

Have used HE* from the start which is great when shooting events and action sequences which yield thousands and thousands of files.
 
You can simply convert to .dng with Adobe DNG Converter (which is free).

A few times, I've exported TIF files from NX Studio (making sure to use 16 bits/channel) and reworked them a little in Darktable. It went rather well if the exposure and WB were already set in NX Studio. But I would definitely not work on JPG files because they're limited to 8 bits per channel. Avoid lossy compression, of course.

The TIF files can be huge, so don't forget to delete them.
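A quick numeric illustration of the 8-bit vs 16-bit point above, just counting code values rather than touching any real files: the shadow region of an 8-bit file holds only a handful of distinct levels, so any strong edit has to stretch them far apart (which is where banding comes from), while a 16-bit TIF keeps thousands. The 5% "shadow band" cutoff is an arbitrary choice for the example.

```python
# Count how many distinct code values live in the darkest 5% of the range
# for 8-bit vs 16-bit files. Purely numeric; no real image files involved.

def shadow_levels(bit_depth, shadow_fraction=0.05):
    """Number of distinct code values in the darkest `shadow_fraction` of the range."""
    full_scale = 2 ** bit_depth - 1
    return int(full_scale * shadow_fraction) + 1

for bits in (8, 16):
    print(f"{bits}-bit file: {shadow_levels(bits)} distinct code values "
          f"in the bottom 5% of the range")
```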
 