Official Nikon Z9 Launch, Info, and Discussion Thread


I just don’t understand why it didn’t grab the eye again immediately
I can't remember where, but he posted an EVF view of the scene in question, and it lasted around a second from start to finish. It was essentially a basketball player running full tilt straight at the camera, in focus until the ball completely obscured his face. If memory serves, he only got a few frames off after that before being run over by the players. This scenario, shot at f/1.2, is forgivable, and he even admits he questions whether the Sony or Canon would have done any better.

Edited to add: the entire shot sequence lasted 17 frames. Assuming he was shooting at 20 fps, then everyone is nitpicking AF behaviour that happened in less than a second, at f/1.2, the first time a photographer used it to shoot basketball lol. One split second out of an entire basketball game lol
 



This may be why Hogan prefers his D6 over the A1 for sports action: as he says, it's stickier. That said, the Z9 is behind the A1, so something to consider.

I feel the Canon R1, as I have said previously, will be the start of the mid-to-late-2022 shake-up that will see Canon leap ahead in lots of ways. Sony will deliver their A2 version of the A1, with 6 more versions looming LOL, and Nikon will release, late 2022 or early 2023, the super-fast, high-ISO, sticky, low-res pro camera.......
The above image, considering the glass and camera used, is a little bit of a concern, and the 17 before it are tack sharp, hmmm.....spray and shoot to document seems to be in......photography is out.

Only an opinion......with feet on the ground........Happy days......
 
When Nikon turned the 200-400 into the 180-400 w/TC, it resulted in a reduction in length of only a fraction of an inch. The rearmost element of the 400E is a good distance from the mount.

I'm also considering the excellent measurement analysis by fcotterill, posted in the 800PF thread, which estimates the 400S at 390mm compared to the 400E at 358mm. So I think that in the case of a prime lens that adds a TC and has to add flange distance, there will be an increase in length. They likely had more room to play with when they shortened the zoom from the older 200-400 to the modern 180-400 lens groupings.

But the main thing is the relative lengths of the 400 vs. the 600. They are much too close in length for both to have added a 1.4x TC, unless they went for vastly different redesigns, which seems unlikely.
 
Sorry to interrupt the AF discussion/speculation here, but for what it's worth: as of 8 minutes ago, Nikon Europe has charged my credit card.

I don't know exactly what that means in terms of shipping, but now I think there is a chance of either late December or early January. The recent discussions had made me think mid-Q1 or so.

For context, I ordered online from the Nikon store within the first minute or two after it was possible; I think their European YouTube channel was still showing the logo and countdown, while the Japanese one already had people speaking for a few minutes.

Also, I'm not NPS or anything like that; in Switzerland, your main income must be photography to get NPS status. The only thing I get as payback is the pleasure of photography and, sometimes, thanks from friends and extended family 😂

I'm betting now they'll attempt to deliver during the holidays...
 
This may be why Hogan prefers his D6 over the A1 for sports action: as he says, it's stickier. That said, the Z9 is behind the A1, so something to consider.

I feel the Canon R1, as I have said previously, will be the start of the mid-to-late-2022 shake-up that will see Canon leap ahead in lots of ways. Sony will deliver their A2 version of the A1, with 6 more versions looming LOL, and Nikon will release, late 2022 or early 2023, the super-fast, high-ISO, sticky, low-res pro camera.......
The above image, considering the glass and camera used, is a little bit of a concern, and the 17 before it are tack sharp, hmmm.....spray and shoot to document seems to be in......photography is out.

Only an opinion......with feet on the ground........Happy days......

Stickiness is normally adjusted with the Focus Lock-On setting on Nikon cameras. But it also means the camera can be slower to pick up a new target, because it is stuck on a prior target or an out-of-focus target. Finding the face and then handling a blocked face comes down to a setting he probably does not know should be adjusted.
 
Sorry to interrupt the AF discussion/speculation here, but for what it's worth: as of 8 minutes ago, Nikon Europe has charged my credit card.

I don't know exactly what that means in terms of shipping, but now I think there is a chance of either late December or early January. The recent discussions had made me think mid-Q1 or so.

For context, I ordered online from the Nikon store within the first minute or two after it was possible; I think their European YouTube channel was still showing the logo and countdown, while the Japanese one already had people speaking for a few minutes.

Also, I'm not NPS or anything like that; in Switzerland, your main income must be photography to get NPS status. The only thing I get as payback is the pleasure of photography and, sometimes, thanks from friends and extended family 😂

I'm betting now they'll attempt to deliver during the holidays...
Wow, that's great news for you. I just checked, and I have not been charged. Hopefully it will happen later today; you are in a time zone six hours ahead of me.
 
Last night, I attended a Nikon Owners Club online Q&A seminar led by Nikon UK's Ricci Chera and Neil Freeman. They answered questions about some details of the Z9, but more about the new Z lenses. The major portion of the hour was dedicated to these lenses, including the 800mm f/6.3 S PF; some feedback here.

They both said the one question "we CAN'T answer is when the Z9 is shipping...". They themselves have no fixed date yet from Nikon UK Logistics as to when they will receive their own Z9 cameras; they asked on Tuesday and still hadn't heard... when!
 
Not trying to keep this line of discussion alive, but I think what we're getting at is it's not clear if the modes/settings used were optimal for the desired behavior.

But sure, let's assume they are.

It's interesting that we're getting to the point where we're arguing over who can better get pin-sharp results on recently obscured, moving subjects at f/1.2.
I think you nailed it. AF has come so far over the last 2 years that we are discussing nuances we didn't even know we could have a few years ago :)
 
One last point about Fro's report and the missed shot: he may have had auto subject detection on as well. If that's the case, the camera is likely looking for all sorts of objects, including birds and cars lol, and could easily be confused. The reality is that a basketball is shaped very much like a head, and the fact that the focus jumped to it shouldn't surprise anyone. I just don't think there is much to be made of a couple of missed shots at f/1.2 over the course of an entire game.
 
No matter how advanced the AI algorithms get, they still have to work in tandem with the other user-defined settings.
AF tracking with lock-on is basically a setting that directs the AF to act in a specific manner. When you are tracking a subject and there's suddenly an obstruction, how do you want the camera to focus in that situation? Should it ignore the obstruction and stick with the primary subject you are tracking (a setting of 4-5), or do you want the camera to ignore your primary target and instead focus on the obstruction (a setting between 1-3)? The latter is generally what causes the AF box to be too sensitive/jumpy, as it adjusts for every minor change.

Like I mentioned in the other post, with the Z9 it is possible to move between a sensitivity setting of 1 and 5 with the press of a button, so this is an amazing customization option for wildlife/sports use cases.
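As a toy illustration of how such a sensitivity setting might behave, here is a sketch in Python. The mapping (hold time doubling per step) and all the numbers are invented assumptions for illustration only, not Nikon's actual firmware logic:

```python
# Toy model of a "blocked shot AF response" style setting:
# 1 = quick to refocus on an obstruction, 5 = stay locked on the
# original subject. The doubling rule and frame counts are invented
# assumptions, not Nikon's actual algorithm.

def frames_before_refocus(sensitivity: int, fps: int = 20) -> int:
    """How many frames the AF waits for the original subject to
    reappear before giving up and refocusing on the obstruction."""
    if not 1 <= sensitivity <= 5:
        raise ValueError("sensitivity must be between 1 and 5")
    hold_seconds = 0.05 * (2 ** (sensitivity - 1))  # assumed doubling per step
    return round(hold_seconds * fps)

for s in range(1, 6):
    print(f"setting {s}: holds for {frames_before_refocus(s)} frames")
```

The point of the sketch is only that a higher setting means the camera holds its prior subject longer through an obstruction, which is exactly the trade-off being debated here.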


Let me reiterate.

The AI is programmed to understand & recognize the eye based on size, shape, colour or whatever other inputs.

So if one has chosen eye tracking/AF within the entire frame, & if there is one or a set of eyes, the AF system should track the eyes alone & nothing else.
Nothing else should distract it, unless it is another set of eyes.
It has absolutely nothing to do with sensitivity. Jumping from one set of eyes to another is affected by sensitivity, but it wouldn't & shouldn't cause the AF to jump from the eyes of a basketball player to the basketball (a non-eye subject) in the foreground, or from the eyes of, say, a tiger to a random twig elsewhere.

If it does get distracted, it means the 'existing software' with whatever juice it is getting (hardware & light etc.) is not good enough to identify & track the eyes all the time.

A good comparison would be how Tesla manages to make self-driving cars. It is the software which works like an intelligent person, identifying different obstacles, cars etc. That's how it knows how much to accelerate & when to brake etc. A few years ago Tesla couldn't do that. Now it can with better software & hardware.

We know the Z9 has adequate hardware to drive the AI; it's just that the AI implemented has to be smart enough in the first place.

It is only a matter of time before the AI gets better at the task. It already happens incrementally with firmware updates. We just need bigger leaps.
 
...A good comparison would be how Tesla manages to make self-driving cars. It is the software which works like an intelligent person, identifying different obstacles, cars etc. That's how it knows how much to accelerate & when to brake etc. A few years ago Tesla couldn't do that. Now it can with better software & hardware.
...

Would you sit in the back seat of a Tesla without a driver on a winding mountain road without guardrails? I wouldn't! ( Just as I wouldn't ride with some humans over the same route.) We're still a long way from human-level machine thought.
 
Maybe it's not a good idea to speculate about the possible reasons for Polin's autofocus "failures". In a few days or weeks there will be plenty of data available from many sources to make a better estimate.
On a side note regarding the so-called "AI": it's a delicate matter to train a DNN to perform well at multiple tasks at once. Asking one network to recognize too many subject types tends to degrade its performance on each, and I think it'll always be better to constrain the camera to one specific task rather than full auto. With too many candidate subject types, at some point some unwanted object will look like what you're trying to categorize.
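That last point can be shown with a trivial numeric sketch. The detection scores below are invented for illustration (not real detector output): in a full-auto mode every class competes, so a look-alike class can narrowly win, while constraining the candidate set removes that failure mode.

```python
# Invented single-frame detection scores; purely illustrative.
scores = {"face": 0.48, "basketball": 0.52, "bird": 0.10, "car": 0.05}

# Full-auto: every class competes, and the round basketball
# narrowly out-scores the face.
auto_pick = max(scores, key=scores.get)

# Constrained "people only" mode: only the face is a candidate.
people_only = max(["face"], key=lambda c: scores[c])

print(auto_pick)    # basketball
print(people_only)  # face
```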
 
So if one has chosen eye tracking/AF within the entire frame, & if there is one or a set of eyes, the AF system should track the eyes alone & nothing else.
Nothing else should distract it, unless it is another set of eyes.

The only problem with this argument is that one does not select "Eye Tracking" on the Z9, but rather "Subject Tracking". And from what I understand, while the eyes are prioritized of course, if eyes are not found the head comes next, followed by the torso/body, etc. So while, yes, if eyes are present one would like to think the camera would stay with them, from my understanding of the algorithms, to say "the AF system should track the eyes alone & nothing else" is not strictly correct.
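The fallback order described there amounts to a simple priority list. A minimal sketch, with hypothetical feature labels (this is not the actual firmware logic):

```python
# Priority order described above: eyes first, then head, then
# torso/body. Feature names are hypothetical labels.
PRIORITY = ["eye", "head", "torso", "body"]

def pick_focus_target(detected):
    """Return the highest-priority detected feature, or None
    if no subject feature was found at all."""
    for feature in PRIORITY:
        if feature in detected:
            return feature
    return None

print(pick_focus_target({"torso", "head"}))  # head outranks torso
print(pick_focus_target(set()))              # no subject features found
```

On this model, the eyes win only when they are actually detected; the moment detection drops them, the system legitimately falls back to something else.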
 
Uh, why? If I tell the camera "don't stick on a subject if obscured", why should it stick on the subject? The camera cannot know you wanted it to stick on the eyes if you set it differently.

Do you also want the camera to shoot in AF-C even when you've set it to AF-S, just because your subject has moved?


In this case with human eye tracking on, it should focus on the eyes. If, & only if there are no eyes, it should focus on the face. If no face, then the trunk of the body.

In the pic Jared has shared, the eyes are visible & yet they aren't tack sharp. Nothing is 100%, but there is plenty of room for improvement in every brand's AI at identifying the eyes. In this case it is the Z9.

The Z9 does 120 AF calculations per second & can focus at f/22 & -6 EV or something. So clearly it is not the hardware. So irrespective of the sensitivity, as soon as the eyes are seen in the frame, the AF point should be bang on the eyes.

It is quite simple from where I see it. The AI has to get better. I am sure all 3 brands are feeding data & are trying to improve it even as we speak...
 
Last night, I attended a Nikon Owners Club online Q&A seminar led by Nikon UK's Ricci Chera and Neil Freeman. They answered questions about some details of the Z9, but more about the new Z lenses. The major portion of the hour was dedicated to these lenses, including the 800mm f/6.3 S PF; some feedback here.

They both said the one question "we CAN'T answer is when the Z9 is shipping...". They themselves have no fixed date yet from Nikon UK Logistics as to when they will receive their own Z9 cameras; they asked on Tuesday and still hadn't heard... when!
I'm curious about why they "CAN'T" answer that question, especially with reports here that Nikon Europe is charging the credit cards of people who preordered Z9s. It sounds like distribution is beginning/ongoing.

Other than that, what are your thoughts on the presentation?
 
Yes, I think that is fine personally. That is why 120 calculations per second is a big deal.
Otherwise, what is it supposed to do, predict the next position of the eye? It's not the same as catching something in the foreground if the face is blocked momentarily??? 🤷‍♂️
That's what I think of when I see "sticky": it stays locked on while the eye is visible, without jumping to something else.

If the eyes & face were blocked, & as human eye tracking is chosen, then it should track the trunk of the basketball player's body & not the basketball.
 
Let me reiterate.

The AI is programmed to understand & recognize the eye based on size, shape, colour or whatever other inputs.

So if one has chosen eye tracking/AF within the entire frame, & if there is one or a set of eyes, the AF system should track the eyes alone & nothing else.
Nothing else should distract it, unless it is another set of eyes.
It has absolutely nothing to do with sensitivity. Jumping from one set of eyes to another is affected by sensitivity, but it wouldn't & shouldn't cause the AF to jump from the eyes of a basketball player to the basketball (a non-eye subject) in the foreground, or from the eyes of, say, a tiger to a random twig elsewhere.

If it does get distracted, it means the 'existing software' with whatever juice it is getting (hardware & light etc.) is not good enough to identify & track the eyes all the time.

I'm really not trying to keep this thread alive, but I disagree. If you are combining eye detection and a tracking mode, which I assume is in play here, the eye detection is secondary to the subject. Eye detection may aid the system in identifying a subject, but once it goes into tracking mode it's tracking that subject; eye detection at that point simply prioritizes what part of the subject it focuses on.

My point is that if the system loses track of the subject, it must re-acquire a "new" subject, and how that happens will depend on lots of things, including the settings. In fact, having the blocked-shot response cranked to 5 may have hindered in this case, because the camera may not have given up on the previous subject and so doesn't realize the new subject is the same one.

In reality we don't really know how these systems work, but I don't think your theory fits well unless a full-auto AF mode (not tracking) is in use. And when we consider tracking modes, it matters when the camera loses its subject, and we don't really have good visibility into that (AFAIK). It may be obvious to us what the subject is, but I suspect it's a lot less obvious to the camera. YMMV.
 
Not trying to keep this line of discussion alive, but I think what we're getting at is it's not clear if the modes/settings used were optimal for the desired behavior.

But sure, let's assume they are.

It's interesting that we're getting to the point where we're arguing over who can better get pin-sharp results on recently obscured, moving subjects at f/1.2.

Yeah, we totally should. We have self-driving Teslas, don't we, which can kill people if badly done... Also, combustion Ferraris are kinda lame now in terms of acceleration... electrics are way better.

BTW, soul-sucking corporates are allowing WFH options even in developing countries...

Surely f/1.2 eye-AF stuff is the least we can expect :D
 
It is quite simple from where I see it. The AI has to get better. I am sure all 3 brands are feeding data & are trying to improve it even as we speak...
I hope they are each studying the others' products to improve their own. While patent and copyright laws apply in these cases, a feature in one product may legally be arrived at in a different manner, or improved upon, in another. In other words, ideas beget better ideas.
 
Yeah, we totally should. We have self-driving Teslas, don't we, which can kill people if badly done... Also, combustion Ferraris are kinda lame now in terms of acceleration... electrics are way better.

BTW, soul-sucking corporates are allowing WFH options even in developing countries...

Surely f/1.2 eye-AF stuff is the least we can expect :D
I'll still take the Enzo over the Tesla at Le Mans and Sebring! ;)
 