    Aid or Ailment: Developing a Framework for Mission-Directed AI Integration

    In the first week of school, my daughter received an assignment with minimal instructions. She asked her teacher for help five days before the due date but was left waiting, which slowed her progress. With two days remaining, her anxiety grew and her optimal state for learning vanished. Still, no support arrived. Nor is this a problem with one teacher alone; larger systems also contribute to the lack of student support.

    Frustrated, she turned to me for help, and I turned to AI tools. Within seconds, AI generated answers for her initial data set. When asked to explain the process, it provided a step-by-step guide to calculating total error and standard deviation. Using this approach, she worked through the remaining data sets. I imagine it won’t be long before an AI company’s slogan is, “From tears to understanding in one prompt.”
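
    For readers curious what that kind of step-by-step guidance can look like, here is a minimal sketch in Python. The numbers are invented, and the reading of “total error” as the sum of each measurement’s deviation from the mean is my own assumption, not the assignment’s.

        import math

        # Hypothetical measurements; the assignment's actual data is not reproduced here.
        data = [9.8, 10.1, 9.9, 10.4, 9.6]

        # Step 1: the mean of the data set.
        mean = sum(data) / len(data)

        # Step 2: "total error", taken here as the sum of absolute deviations from the mean.
        total_error = sum(abs(x - mean) for x in data)

        # Step 3: sample standard deviation -- square the deviations, average over n - 1,
        # then take the square root.
        std_dev = math.sqrt(sum((x - mean) ** 2 for x in data) / (len(data) - 1))

        print(f"mean = {mean:.2f}, total error = {total_error:.2f}, std dev = {std_dev:.2f}")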

    AI is already finding its way into your school. The larger question is how: will it quietly enter through the back door, brought by students and parents who are frustrated by a lack of support or simply looking for shortcuts? Or will you intentionally step into this space, using your mission and vision as a framework for thoughtful AI implementation?

    Mission statements are meant to cast a vision for the future while answering the question: “How should we live now?” We live in a cultural moment, an “AI moment,” with a potential impact comparable to that of the printing press, agriculture, the wheel, and the internet.

    Your mission is based on principles guiding your organization into the future—a future where AI will be a dominant force. Christian institutions must ask, “How do we equip students to be a faithful presence in an AI-driven world?” If we are not walking with students and families on this journey, technological advancements will be shaped by competing narratives of self-gratification and individualism, winning over the hearts of our students—and possibly our own.

    To engage AI responsibly, schools must consider what decision-making framework they will use to reinforce their vision in a world where AI-based educational tools are evolving rapidly. One starting point might be the PELT principle:

    Privacy
    Generative AI tools are built on large language models trained on enormous bodies of text, and many tools continue to learn from the prompts and responses users provide. Unless a tool explicitly states otherwise, users should assume that information they share may be stored internationally and treated as public. Thoughtful consideration must be given to where data is stored, how it is harvested, and who has access to it. Private information should not be shared, as it can be exploited once it is integrated into AI models.

    Equity
    Redemptive AI use in Christian schools (as referenced in Andy Crouch’s blog, tinyurl.com/p299th9m) ensures that all learners benefit from AI, with a focus on the most vulnerable in our communities. If resources are limited, prioritizing support for marginalized students may be the most appropriate place to start. As each person bears the image of God, we must ensure everyone has equitable access to the support they need to thrive.

    Learning
    Learning-centred AI prioritizes deep learning over efficiency or mere task completion. In education, faster is often not better. In a world obsessed with speed, we can draw on the wisdom of our Indigenous neighbours, who understand that learning requires patience and time. AI tools should enhance the learning process rather than shortcut or replace it.

    Transparency
    If teachers are uncomfortable explaining how they use AI to write report cards, they shouldn’t use those tools. Likewise, students should not use AI in ways their teachers would consider cheating. Transparency is crucial for all AI-related practices. Teachers can use a “traffic light” metaphor to guide AI usage: red means no AI, yellow means limited use with clear guidelines, and green means all appropriate uses are encouraged. By setting clear expectations, teachers help students develop strategies for responsible AI use.

    AI applications can address many of the shortcomings in today’s educational system. As a tutor, AI can meet students at their level, offering personalized support for optimal learning. It can help students edit papers against teacher-designed criteria and provide constructive feedback. As a co-creator, AI can help teachers generate ideas, draft challenging emails, and find a starting point in difficult situations. However, if ignored, AI can also exacerbate existing problems. Schools that continue to focus solely on content acquisition, memorization, and irrelevant projects create environments that encourage students to use AI to bypass learning while still achieving artificial success.

    Mission-driven AI implementation can mitigate these risks, inviting students and families to embrace their role in God’s story as ambassadors for faithful and responsible AI use.

    Darren Spyksma
    SCSBC Associate Executive Director