Your Students Are Already Using AI. The Question Is Whether You're Ready to Help Them Do It Well.

I recently stood in front of a room full of faculty at the SLCC Faculty Research Conference and said something that probably made a few people uncomfortable.

Not because it was radical. Because it was true.


The Debate About AI in Education Is Not Really About AI.

It is about what we believe learning is for.

And right now, a lot of us have not answered that question honestly.

We have policies written in fear. We have zero-tolerance rules in classrooms where students are already using these tools every single day — not to cheat, but because that is the world they live in. We have faculty who are genuinely trying to protect academic integrity while accidentally producing graduates who are unprepared for the workforce they are about to enter.

That is not a technology problem. That is a belief problem. And it starts with this:

If we believe learning is about demonstrating compliance with a process, AI is a threat.

If we believe learning is about developing the judgment, communication, and critical thinking students will need to contribute to the world, AI is a tool we should be teaching them to use.

Those two beliefs lead to completely different classrooms. Completely different graduates. Completely different outcomes for the communities we serve.


What Employers Are Not Asking For

Here is what no hiring manager in 2026 is going to say to your graduate:

"Put away your tools. Show me you can do this by hand."

Here is what they are saying:

"Here is a problem. Here is data. Here is an AI platform. Show me your thinking."

That is the room our students are walking into. Not the one we built our syllabi around. Not the one some of our policies were written to protect. The one that already exists, right now, waiting for them.

If we graduate students who can only perform in a world without AI, we have not protected them. We have left them behind — with a diploma in hand and a skill gap nobody warned them about.


What Responsible AI Use Actually Looks Like

I did not just talk about this at the conference. I showed it.

Over the past few weeks, I built Money Moves University using AI at every stage: market research, curriculum frameworks, partnership strategy, pricing models. AI made me faster, more informed, and more competitive than I ever could have been working alone.

But here is what AI never did. It never made a single decision. It never understood the communities I was building for. It never knew when a financial concept would land differently with a first-generation college student than with a banking professional. That was always my work.

That is the model. AI as a thinking partner. A tool that accelerates what you bring to it — not a shortcut around bringing anything at all. The human judgment in the middle is not optional. It is the whole point.


Three Things You Can Do This Semester

Audit one assignment. Pick one where students could use AI productively. Redesign it so their judgment is required, not just their output. An oral component, a revision log, a "why did you keep this?" prompt. Something that proves they understood what they submitted.

Add an AI transparency prompt. Ask students to document what tools they used, how, and what they changed. That is metacognition. One of the highest-order skills we can develop — and it costs nothing to add.

Try it yourself first. Before deciding AI threatens your course, spend thirty minutes with ChatGPT, Claude, or Copilot on your own subject matter. See what it gets right. See what it gets wrong. You cannot lead students through something you have never experienced.


The Students Are Not the Problem

Here is the part that should keep educators up at night.

Our students are not hiding AI use because they want to cheat. Most of them are hiding it because we told them to. We created an environment where the honest answer, "I used AI and here is how," is more dangerous than the dishonest one. We punished transparency and called it policy.

And while we were doing that, the workforce moved on without us.

The students who will thrive are not the ones who avoided AI. They are the ones who learned to direct it, question it, pressure-test it, and build on top of it. They are the ones who developed the one thing AI cannot replicate: judgment forged through real thinking, real stakes, and real accountability.

That is what education has always been for.

The tools change. That does not.


One Question Worth Sitting With

If a student spent three hours building a beautiful, perfectly formatted presentation with zero original thought — and another student spent twenty minutes prompting AI, then two and a half hours critiquing it, restructuring the argument, and adding their own research and perspective — which one learned more?

You already know the answer.

The question is whether our assessments do.
