Double Standard
- Editorial Board

- Nov 23
- 4 min read
Maclay School Needs to Rethink Its Approach to AI

In recent years, Artificial Intelligence (AI) has become a constant presence in business, politics and now even the Maclay community. Despite positive intentions, Maclay’s approach to AI reveals a double standard: students are discouraged from using AI, while the administration embraces it for experiments and official branding. As it stands, Maclay's use of AI is more negative than positive.
Maclay is a business, and in any business, branding is a major component. One of the most prominent aspects of that branding is Rowdy, the school’s mascot, whose image is present across various platforms. However, some people have noticed that images of Rowdy look less like the work of a human artist and more like AI-generated or stock illustrations.
“I think, simply put, [the use of AI art rather than ones by artists is] a missed opportunity,” junior Lily Money said. “Instead of inviting their community of insanely talented artists to make art for the school, they delegate the work that many people would be happy to do to a robot — which they publicly stand against doing when it comes to academic work. I don't understand how it is any different.”
“I do use Chat[GPT] for Rowdy,” Head of Communications Angie Herron said. “[Editing the photo]’s where my graphic design background can come in. I can take an image and manipulate it myself in Photoshop. So I will use it a lot as a starting point.”
By using AI-generated art for branding, Maclay risks normalizing plagiarism and undermining the originality it expects from students. The school’s use of AI also discourages students who are passionate about art by signaling that their work can be replaced by a robot. Ultimately, if the school can turn to AI for branding, it raises the question of why students shouldn’t be allowed to use the same tools.
“I personally feel that it’s a bit hypocritical,” junior Lily Smith said. “For aspiring artists, I think [the school’s usage of AI art] is a hard pill to swallow. Of course, Maclay doesn’t want students to be dissuaded, but I think they also don’t understand the full extent to how much simple actions like this can spark an issue.”
However, not all of Maclay’s approaches to AI are negative. One noteworthy positive is the Center for Math and Artificial Intelligence, a building currently under construction whose planned AI Lab could include spaces for research and preparation for the future. By investing in programs like this, Maclay gives students the chance to be educated about AI, preparing some for a possible career in the industry. It is a development that keeps up with the times.
“We have to make sure you guys have a full tool belt [on AI] by the time you leave here,” Head of School James Milford said. “What I hope is that we approach [AI] out of opportunity. I hope we can be a little more open-minded in terms of how you use it, because there’s different levels.”
Another example of Maclay’s use of artificial intelligence is the “Ask Rowdy” chatbot on the maclay.org website. Supporters see it as a positive addition because it offers convenience for simple questions, and when it can’t provide an answer, it directs users to a human contact who can. However, the chatbot’s performance is inconsistent. For instance, when asked when Maclay was founded, it sometimes gives the correct answer and other times claims not to know. Why would the school choose to launch a tool that is still incomplete and unreliable?
At the same time, the school's creation of a new AI Taskforce shows a more thoughtful approach. The group, made up of administrators and teachers, is exploring the potential of AI, looking into policies on its use and experimenting with the technology to see whether it can be implemented ethically to support student learning. This is what the school should be doing: treating AI as an instrument for learning rather than a shortcut for efficiency. The English department is taking a step in this direction by updating its policy to allow students to experiment ethically with AI under guidance.
“Part of what [the AI taskforce] has to do is read about it, experiment with it,” English teacher Lee Norment, who is on the taskforce, said. “The other outcome is to think about if and how [AI] can be implemented to help learning so it’s not a total ban, that's not the goal.”
However, there’s also the issue of AI’s environmental impact: it requires vast amounts of electricity and water to operate. In 2022, data centers consumed 460 terawatt-hours of electricity, making them the 11th largest electricity consumer in the world. A query to a large language model uses five times more electricity than a web search, and the water required to cool servers, along with the water pollution created during hardware production, raises further concerns. By adopting AI tools solely for efficiency, Maclay contributes to an industry with hidden costs and risks sending the wrong message to students about responsibility and sustainability.
“The environmental impact of AI, I think, is problematic,” Norment said. “I have ethical concerns about the technology as a whole. I think the key is just to be mindful about how it’s used and when it’s used.”
Though Maclay is taking positive steps regarding AI, its use of the technology creates more problems than benefits: it sends mixed signals to students, overlooks important consequences and prioritizes efficiency over authenticity. Overall, Maclay doesn’t need to abandon its AI efforts, but it does need to rethink its approach and consider whether its business-driven shortcuts do more harm than good.



