The whole “maybe if the homework can be done by a machine then its not worth doing” thing is such a gross misunderstanding. Students need to learn how the simple things work in order to be able to learn the more complex things later on. If you want people that are capable of solving problems the machine can’t do, you first have to teach them the things the machine can in fact do.
In practice, compute analytical derivatives or do mildly complicated addition by hand. We have automatic differentiation and computers for those things. But I having learned how to do those things has been absolutely critical for me to build the foundation I needed in order to be able to solve complex problems that an AI is far from being able to solve.
Maybe if homework can be done by statistics, then it’s not worth doing.
Lots of homework can be done by computers in many ways. That’s not the point. Teachers don’t have students write papers to edify the teacher or to bring new insights into the world, they do it to teach students how to research, combine concepts, organize their thoughts, weed out misinformation, and generate new ideas from other concepts.
These are lessons worth learning regardless of whether ChatGPT can write a paper.
It does feel like some teachers are a bit unimaginative in their method of assessment. If you have to write multiple opinion pieces, essays or portfolios every single week it becomes difficult not to reach for a chatbot. I don’t agree with your last point on indoctrination, but that is something that I would like to see changed.
Even if the prompt is clear, the ask is a trap in and of itself. Because it’s not possible to actually do, but it will induce an LLM to synthesize something that sounds right.
If it was not ‘hidden’, then everyone would ask about that requirement, likely in lecture, and everyone would figure out that they need to at least edit out that part of the requirements when using it as a prompt.
By being ‘hidden’, then most people won’t notice it at all, and the few that do will fire off a one-off question to a TA or the professor in an email and be told “disregard that, it was a mistake, didn’t notice it due to the font color” or something like that.
This sounds fake. It seems like only the most careless students wouldn’t notice this “hidden” prompt or the quote from the dog.
Maybe if homework can be done by statistics, then it’s not worth doing.
Maybe if a “teacher” has to trick their students in order to enforce pointless manual labor, then it’s not worth doing.
Schools are not about education but about privilege, filtering, indoctrination, control, etc.
The whole “maybe if the homework can be done by a machine then its not worth doing” thing is such a gross misunderstanding. Students need to learn how the simple things work in order to be able to learn the more complex things later on. If you want people that are capable of solving problems the machine can’t do, you first have to teach them the things the machine can in fact do.
In practice, compute analytical derivatives or do mildly complicated addition by hand. We have automatic differentiation and computers for those things. But I having learned how to do those things has been absolutely critical for me to build the foundation I needed in order to be able to solve complex problems that an AI is far from being able to solve.
deleted by creator
Lots of homework can be done by computers in many ways. That’s not the point. Teachers don’t have students write papers to edify the teacher or to bring new insights into the world, they do it to teach students how to research, combine concepts, organize their thoughts, weed out misinformation, and generate new ideas from other concepts.
These are lessons worth learning regardless of whether ChatGPT can write a paper.
It does feel like some teachers are a bit unimaginative in their method of assessment. If you have to write multiple opinion pieces, essays or portfolios every single week it becomes difficult not to reach for a chatbot. I don’t agree with your last point on indoctrination, but that is something that I would like to see changed.
Removed by mod
Even if the prompt is clear, the ask is a trap in and of itself. Because it’s not possible to actually do, but it will induce an LLM to synthesize something that sounds right.
If it was not ‘hidden’, then everyone would ask about that requirement, likely in lecture, and everyone would figure out that they need to at least edit out that part of the requirements when using it as a prompt.
By being ‘hidden’, then most people won’t notice it at all, and the few that do will fire off a one-off question to a TA or the professor in an email and be told “disregard that, it was a mistake, didn’t notice it due to the font color” or something like that.