By Zhang Huixin (23J17)
Image taken from: ScienceNewsExplores
ChatGPT is the hot new topic recently, with its seemingly infinite abilities causing a frenzy among people, both those who wish to gladly serve their future AI overlords, and those afraid that ChatGPT and its kind will take their jobs away from them. ChatGPT does seem all-powerful – it can answer your oddly specific questions, help you organise your itinerary, and, if you want to and have the right tools, even produce automated handwritten homework!
If only I had continued learning DnT…
Of course, not all students will go to the extreme of automating their homework entirely, though I am sure most have tried roping in ChatGPT and its seemingly all-knowing computer brain to help with their homework. And so we arrive at the real question everyone wants answered: can ChatGPT, or AI in general, help us with our homework? After all, laziness is human nature, and I am sure every student wishes to lighten their workload, or simply fry their brain a little less while doing homework.
Well, from my experience as a JC1 computing and physics student, I can safely say that ChatGPT is quite…bad when it comes to helping me with computing and physics. Given that ChatGPT is a computer programme, would you not think it should at least be proficient in programming?
However, there have been times when I fed in a wrong answer and ChatGPT still told me that my answer was right (I'm flattered, but I was not right!). As a student who is also weaker in physics, I have tried feeding in physics questions for the AI to solve, only for it to give me wrong answers!
This has happened more than a few times T.T
It’s not just the sciences that trip ChatGPT up, but the languages too. When asked to summarise a play and given only the gist of the plot, ChatGPT, instead of clarifying or asking for more details, will confidently produce an incorrect summary filled with completely foreign details.
This is known as hallucination, a phenomenon where the AI, when it lacks information, confidently makes things up and presents them as facts. This means that whenever you use ChatGPT to generate something, there is an inherent risk of the facts being inaccurate and sometimes straight-up fictional, making it necessary to cross-reference the generated content.
This picture is taken from the Wikipedia page on AI hallucination. The link given does not exist, yet a summary of the imaginary article is confidently presented as fact.
That is not to say that ChatGPT is useless, of course. It has proven to be a useful tool for generating ideas, especially for Project Work, CID, or English essays. Its ability to quickly come up with multi-faceted points is useful and time-saving, and it also helps students dive deeper by exploring more points than they usually would.
Furthermore, its ability to come up with questions quickly can be powerful if utilised properly, making it a cheap and accessible source of extra revision questions for the science subjects. For the humanities, you can also use it to set essay questions to practise with. With the proper mindset, you can easily turn ChatGPT into a powerful revision tool.
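(For the computing students among us, the same idea can even be scripted. Below is a minimal sketch, assuming access to the OpenAI API and the openai Python library; the model name, prompt, and topic are illustrative assumptions on my part, not something prescribed anywhere in this article.)

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

topic = "simple harmonic motion"  # hypothetical JC physics topic

# Ask for questions and answers separately, so you attempt them before peeking.
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system",
         "content": "You are a JC physics tutor writing practice questions."},
        {"role": "user",
         "content": f"Write 3 exam-style revision questions on {topic}, "
                    "then list the answers separately at the end."},
    ],
)

print(response.choices[0].message.content)

The point here is the prompt rather than the code: asking for the questions first and the answers separately keeps ChatGPT as a revision aid, not an answer machine.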
To get a better understanding of how ChatGPT is perceived as a learning tool, I interviewed three different people – Dylan Zihong Saga (23J17), a JC 1 science student taking P(Computing)ME; Damien (23J19), a JC 1 humanities student taking HELM; and Ms Candida Ho, a GP teacher. Their views varied, but the conclusion was ultimately the same – ChatGPT should be used as a supplement to learning, and not to answer questions directly, as its ability in that area is still extremely limited.
When interviewing Ms Ho, I provided her with a GP essay that was 100% ChatGPT-generated, without telling her beforehand. She could immediately tell! When asked how, she said, “The style of (ChatGPT) is very generic. It writes like a textbook…So it is very systematic, structured, clear. But natural writing…has more styles. Natural writing has varied sentence lengths and sentence structure.”
She added that she, too, used ChatGPT in her GP lessons as a tool: she would get students to brainstorm first, then check their points against those given by ChatGPT. She also mentioned that ChatGPT is weak at giving local examples, which are necessary to score an A for GP. She remains optimistic about ChatGPT as a learning tool, however, and believes that it can help weaker students better understand the basics.
Interestingly (or perhaps naturally), there is a disparity between how humanities and computing students choose to engage with ChatGPT. Damien, the humanities student, says that he has never used ChatGPT to help with his humanities homework, adding that this is the common consensus among humanities students, as “humanities subjects often have assignments in which the grading system depends heavily on writing style…and it is somewhat easy to tell… if something was written by a human or an engine.”
It seems that ChatGPT’s key weakness is its lack of a natural writing style. Damien believes that ChatGPT should only be used as a tool for ideation, and that one should still possess the know-how to discern between relevant and irrelevant ideas.
Conversely, Dylan, the computing student, uses ChatGPT frequently, such as asking it to clarify economics concepts or to guide him through difficult computing questions. He has also used ChatGPT as a subject tracker, keeping tabs on his lecture dates, homework assignments, and doubts, as well as a career guidance tool, asking it to evaluate his portfolio and internship opportunities.
Another way ChatGPT can be used to help you in your studies!
However, both agree that when it comes to problem-solving, ChatGPT is not the optimal tool, and that it should only be used to assist learning.
Ultimately, generative AI is still a new field. ChatGPT itself is not even a year old, and it is hard to predict how it will change the world in the long term, especially in the education sector. For now, though, to make students’ lives a little easier: yes, AI can help you with your homework!
However, its ability to generate answers directly still leaves much to be desired, and the effort spent tweaking prompts until you get exactly what you want is frankly not worth it. Not to mention, the presence of anti-ChatGPT checkers such as ZeroGPT makes such attempts even harder. Homework is meant for us to reinforce the concepts we learn in school, so using ChatGPT to generate answers directly defeats the purpose and does not help us learn.
Let us embrace this glorious new age of AI, using it as a productivity tool and not as our tell-all crystal ball!