Same with using a calculator, no? Or not memorising log tables.
LLMs are less replacing the need for log tables and more replacing the need to understand why you need a log table. Less replacing a calculator and more replacing the fundamental understanding of math. Sure, you could argue that it doesn’t matter if people know math, and in the end you might be right. But given that ChatGPT can and will spit out random numbers instead of a real answer, I’d rather have someone who actually understands math designing buildings, and people who actually understand anatomy and medicine performing surgeries. Sure, a computer science guy cheating with ChatGPT through school and his entire career probably won’t be setting anyone back other than himself and the companies that hire him, but people like him aren’t the only ones using the “shortcut” that is ChatGPT.
I was never taught what log tables actually are. Anytime logarithms were brought up, it was just “type it into your calculator and it will tell you.”
That wasn’t my experience in school, but there’s a good chance you were just in an introductory class or similar. However, that doesn’t change anything about my argument. If you need the log of something, you know that you need to look up the log in a table to solve the problem. ChatGPT removes the need to even understand that you can use a log to solve a problem, and instead just spits out an answer. Yes, people can use ChatGPT to accelerate learning, as one would a calculator, and in those instances I think it’s somewhat valuable, if you completely ignore the fact that it will lie to your face and claim to be telling you the truth. However, anecdotally I know quite a few folks who are using it as a replacement for learning/thinking, which is the danger people are talking about.
A better comparison would be playing a song on the radio and saying “see, I can produce music.” You still don’t know anything about music production in the end.
Personally I don’t think that’s a good comparison. I would say it’s more like taking a photo and claiming you know how to paint. You’re still actually creating something, but using a digital tool that does it for you. You choose the subject and fiddle with settings to get a better image closer to what you want, and then can take it into software to edit it further.
It’s art in its own right, but you shouldn’t directly compare it to painting.
Even that is a bad analogy; it’s like commissioning a painter to paint something for you, and then claiming you know how to paint. You told an entity that knows how to do the work what you wanted, and it gave it to you. Sure, you can ask for tweaks here and there, but in terms of artistic knowledge, you didn’t need any and didn’t provide any, and you didn’t really directly create anything. Taking a decent photo requires more knowledge than generating something on ChatGPT. Not to mention actually being in front of the thing you want a photo of.
I think my analogy is more accurate
Care to explain? I think your analogy gives the credit of art creation to someone who didn’t create art, and thus is flawed.
I mean, I think I explained myself quite well already, and not to be insulting, but I don’t think you’re willing to accept any argument I would make that goes against what you already believe, since your argument against it is simply you asserting your own beliefs (that AI art isn’t art) as immutable fact.
Oh, I’m not saying AI art isn’t art. It is. I’m just saying that the person writing the prompt didn’t create it, or do anything remotely skilled or artistic to get the result.
Okay, but if they’re the one writing the prompt, changing parameters, and pressing the button to generate it, how are they not the one creating it?
And I do think photography is pretty analogous here. Anyone can point a phone camera at something, hit one button, and generate something. It takes no skill or artistic talent to do so, and the phone is what’s doing all the work, but it’s still creating art. And just like AI, people can put more effort into it: coming up with a creative subject, fine-tuning different settings to get the effects they want, or even using different devices/models to get different images, and retaking it multiple times to get something they’re happy with, then touching it up in editing software.
There’s a key difference between using a tool to crunch a known mathematical equation (because you cannot just say “find X” to the calculator) and having to punch in the right inputs - ergo requiring understanding - and simply asking the teacher for the answer.
Treat AI like the hermit oracle/shaman/divinator of yesteryear, and you’ll get the same results - idiots who don’t know how to think for themselves, and blindly accept what they are told.