They don’t always use the Latin alphabet. At university I hated how my prof used the same letter over and over again in different writing systems: x, chi, Gothic (Fraktur) x, x with a hat, x with a dash, x as a vector, and so on.
This was crazy hard for me because I internally verbalize when I read formulae, so I had to “invent” a different pronunciation for every version of x, because (for example) one is the vector while the plain lowercase Latin one is just the length of that vector.
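For the curious, here is roughly what that zoo looks like in LaTeX. This is my reconstruction of the kind of notation I mean, not the prof’s exact list:

    \documentclass{article}
    \usepackage{amsmath,amssymb}
    \begin{document}
    % The same letter in six different costumes:
    \[
      x, \quad \chi, \quad \mathfrak{x}, \quad \hat{x}, \quad \bar{x}, \quad \vec{x}
    \]
    % and the plain Latin one is "just" the length of the vector:
    \[
      x = \lVert \vec{x} \rVert
    \]
    \end{document}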
Add to that the fact that people use slightly different conventions, and that mathematical conventions in the Anglosphere differ from the ones here, and I frequently couldn’t understand a paper or lecture script without already having an idea of how things worked in the first place. A didactic nightmare.
In programming things are a lot easier, because the conventions are much more uniform; the field isn’t a few hundred years old.
Aaah, maybe that’s even the simple answer to your overall post: Conventions, even if they are arbitrary, make things easier to understand.
When I see i, j, I think “counting loop variable”. Does it matter that it is i? No. Do I think “index”? Nope.
There are code bases where some clever person used (for example) g or p for loop variables; they are just a tiny bit harder to read, until I get used to THAT convention.
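To make that concrete, here is a tiny Python sketch (made-up example, not from any real code base): both loops do exactly the same thing, but the second one makes you pause for a moment until you’ve internalized that g and p are “just” indices here.

    matrix = [[1, 2, 3], [4, 5, 6]]

    # Conventional: i and j read instantly as row/column indices.
    total = 0
    for i in range(len(matrix)):
        for j in range(len(matrix[i])):
            total += matrix[i][j]

    # Same logic, but g and p force a small "wait, what are these?" pause.
    total2 = 0
    for g in range(len(matrix)):
        for p in range(len(matrix[g])):
            total2 += matrix[g][p]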
When I write code, I always try to mirror what’s already there, to make it easier for the next guy, even if I don’t like the style.
And is Long John Thomas a word?