It’s not really factually correct if you want to get pedantic; brains and LLMs are called black boxes for different reasons, but that’s ultimately irrelevant. Whatever your motive, the rhetorical effect is the same. You are arguing, very specifically, that we can’t know LLMs don’t have features similar to human brains (a world model) because “both are black boxes,” which is wrong for a few reasons, but it is also plainly an equivalence. It’s rude to pretend everyone in the conversation is as illiterate as we’d need to be to miss this point.
Yes, we agree on the first part.
I will again direct you here re: the second.