I'm wondering whether anyone can explain how AF tracking and subject detection interact in these two cameras (the Sony A1 and Nikon Z9). To clarify what I mean, here's an explanation of the same thing in the Olympus OM-1 (AIAF = subject detection here):
"My summary of how AIAF and Tracking AF work…
- AIAF and Tracking AF are two fundamentally different and separate systems.
- AIAF looks at the scene in front of the camera the way a human would, identifying subjects by their appearance. Once a subject has been identified, the camera uses the PDAF pixels in that specific area to set focus.
- Tracking AF is the conventional AF that we’ve long been familiar with: It uses a combination of distance (from the PDAF pixels), color and shape to identify the subject (or to follow one that you’ve told it to via the user interface), then uses its movement over time to predict its likely future position. The color and shape help it avoid being confused by other objects in the scene as the subject moves around, but there isn’t the sort of AI-based “intelligence” to recognize an object as a specific type of subject.
- If you’re doing continuous shooting with AIAF, the camera is basically re-identifying the subject in each frame and then focusing on it. It doesn’t make any predictions about the subject's future position based on past behavior."
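To make the contrast in that summary concrete, here's a minimal sketch of the two loops it describes: an AIAF-style loop that simply re-detects the subject every frame, and a tracking-style loop that predicts the next position from past motion (a simple constant-velocity model here). All of the names and the model are my own illustration, not anything from Olympus's firmware.

```python
def detect_subject(frame):
    """Stand-in for subject detection: returns the subject's position in this frame."""
    return frame["subject_pos"]

def aiaf_loop(frames):
    """AIAF-style: re-identify the subject in every frame; no prediction at all."""
    return [detect_subject(f) for f in frames]

def tracking_loop(frames):
    """Tracking-AF-style: predict the subject's likely position from its past
    movement (constant-velocity assumption), then update with the measurement."""
    measured_history, predictions = [], []
    for f in frames:
        if len(measured_history) >= 2:
            # Predict from the last two measurements (constant velocity).
            velocity = measured_history[-1] - measured_history[-2]
            predictions.append(measured_history[-1] + velocity)
        else:
            # Not enough history yet; fall back to the raw detection.
            predictions.append(detect_subject(f))
        measured_history.append(detect_subject(f))
    return predictions
```

For a subject moving at constant speed, the tracking loop's predictions line up with the detections; the practical difference shows up when the subject is briefly obscured or changes direction, where a predictive tracker coasts (or overshoots) while a detect-every-frame system simply reacts to whatever it sees.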
FWIW, Olympus recommends NOT using AIAF and tracking AF at the same time.
Of course, this discussion applies only to the OM-1. Can anyone explain how the A1 and Z9 compare to this approach, and to each other?