I have been taking a deep dive into the recent AI craze, and while there is still a lot to figure out, I have high hopes for how we can use this tool in the classroom as a learning aid. One issue I have seen in so many classrooms, including the Physics classes I taught in the past, is that when students are working on assignment problems and do not understand how to solve them, they will most likely give up if they get stuck and do not have support readily available to them.
If we assign homework to be completed outside of the classroom, a student's success often depends on their current level of understanding and the support system they have at home. This means that the student who already gets it does not really need the homework, while the student who does not get it may not be able to succeed on that particular activity without the right support at home.
ChatGPT and other AI programs could help solve these issues by letting students enter a question and have the program show them how to solve it. They could ask questions and even have a conversation with the program about the process used to solve the problem. If they do not understand a term or concept, they can ask for further explanation to help them learn. They can even have it explain things in different ways that best fit their specific needs, or have some fun while learning by having it speak like a pirate.
To test how well ChatGPT can currently do this with Physics problems, I gave it some problems to solve. On the first problem, ChatGPT produced the correct answer and the correct process for solving it. It explained what it was doing, and if I had been confused, I could have asked it why it did what it did. This was awesome, as I could easily see it helping a student who is stuck on a problem or who just wants to know if they were solving it correctly.

On the second problem I gave to ChatGPT, it showed the right steps but stumbled on the final calculation, so while the process was correct, the final answer was wrong. This is actually a very common mistake that students make, so it seemed almost human to me. It does mean that ChatGPT, at least version 3.5, cannot be trusted all of the time, but that does not mean we should not consider it.

While the second answer was wrong, there was still much we could learn from the process. A student who just copied the answer would of course be wrong sometimes, but if they had to show their work, that could help me understand where they went wrong. I can also give ChatGPT feedback within the conversation to steer it toward better answers. As a teacher, I could have students give it problems and then prove to me whether the answer it produces is correct. This could be a great review exercise, and since ChatGPT is being connected to so many programs and sites, we will be able to access it in a lot of different ways.
I still think this AI program, like many others, can eventually be the tool that helps students when they do not have anyone around to help. I remember many conversations with the parents of my Physics students who were always amazed at what their kids were able to do, since not many parents felt confident in Physics themselves. If used correctly, this type of program could open up complex subjects to more students by giving them the right resources as they learn. Students do not come into our classrooms with the same level of support at home, and this could be one way we make things more equitable, but it should never be the only way.