Your question: Given two cameras in different XY locations along the same local Z axis, can you adjust the settings in one to match the view of the other?
Short answer: No.
Medium answer: Not really. You can adjust the field of view of one so that the edges of the view are at just about the same spot (the cones of each camera intersect the walls in the same spot), but the sizes of objects in frame and their location in relation to each other will not match.
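To see why the frames still won't match, here's a minimal geometric sketch (plain Python, with made-up distances): two cameras on the same Z axis, the rear camera's FOV narrowed so both view cones hit the back wall at the same point. Points on the wall line up, but a point halfway into the room lands in a different spot in each frame.

```python
import math

wall_z = 10.0          # wall distance from camera A (arbitrary example value)
offset = 3.0           # camera B sits 3 units behind camera A on the same Z axis
half_fov_a = math.radians(30)

# Edge of camera A's view cone hits the wall here:
edge_x = wall_z * math.tan(half_fov_a)

# Narrow camera B's FOV so its cone hits the wall at the same point:
half_fov_b = math.atan(edge_x / (wall_z + offset))

def screen_x(point_x, point_z, cam_z, half_fov):
    """Normalized horizontal screen coordinate (-1..1) of a point."""
    return (point_x / (point_z - cam_z)) / math.tan(half_fov)

# A point on the wall's edge lands at 1.0 in both views, as intended:
print(screen_x(edge_x, wall_z, 0.0, half_fov_a))       # 1.0
print(screen_x(edge_x, wall_z, -offset, half_fov_b))   # 1.0

# But an object halfway to the wall lands in different spots in each frame:
print(screen_x(2.0, 5.0, 0.0, half_fov_a))
print(screen_x(2.0, 5.0, -offset, half_fov_b))
```

The last two values differ because perspective depends on where the camera is, not just how wide its cone is; no FOV tweak can compensate for a different position.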
Longer answer: I think you might be trying to solve a different problem than the one you should be trying to solve. In your video, I didn't see the actual photo you're trying to camera match. But if your model is accurate, then your 3D camera should be exactly in the same place as the real camera when the photo was taken. Presumably, that should be inside the room. If it isn't in the same spot, you don't have an accurate camera match.
For an accurate match, you need the physical location and orientation of the camera as well as the focal length of the lens and the camera's sensor width.
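Focal length and sensor width together determine the field of view, which is why you need both. As a quick sanity check (a sketch using the standard rectilinear-lens formula, not anything Max-specific):

```python
import math

def horizontal_fov_deg(focal_length_mm, sensor_width_mm):
    """Horizontal field of view of a rectilinear lens: 2*atan(w / 2f)."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# A 35mm lens on a full-frame (36mm wide) sensor:
print(horizontal_fov_deg(35, 36))   # ~54.4 degrees
```

The same focal length on a crop sensor gives a narrower FOV, which is why guessing the sensor width will quietly ruin an otherwise good match.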
You could dial in the accuracy even more by capturing a distortion map with your specific camera to match the barrel distortion of wide-angle lenses (or play with the distortion amount in the Max camera), but I've found that location/orientation/focal length/sensor width get me close enough.
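For reference, lens distortion is commonly modeled with the Brown-Conrady radial terms; a minimal sketch (the `k1`/`k2` coefficients here are illustrative, not values from any specific camera):

```python
def distort_radial(x, y, k1, k2=0.0):
    """Brown-Conrady radial distortion in normalized image coordinates.
    Negative k1 bulges the image outward (barrel), positive pulls it in
    (pincushion)."""
    r2 = x * x + y * y
    scale = 1 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# A point near the edge of frame pulled inward by barrel distortion:
print(distort_radial(0.5, 0.0, k1=-0.2))   # (0.475, 0.0)
```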
The other important detail with a camera match is that your backplate photo must not be cropped or rotated. Render your image and composite it with the backplate first, then crop afterwards.
i9-12900K @ 3.2GHz, 64GB RAM, 3090ti
Max 2024, Corona 10