This District Is Building an AI Chatbot—But Not for the Students


By Lauraine Langreo

The Arcadia Unified school district in California was already building a basic chatbot specifically for staff use at least four years before ChatGPT became the hottest AI tool on the market, recalled Greg Gazanian, the district’s chief strategy and innovation officer.

But the district paused the project when the pandemic hit.

Then, in 2022, OpenAI publicly released ChatGPT, and generative artificial intelligence became easily accessible to nearly everyone.

Arcadia Unified decided to pick up the development of its chatbot where it left off and experiment with applying a generative AI model to it.

For the past year and a half, Gazanian and his team have been testing and piloting the chatbot with a few staff members. Soon, he said, they’ll open up its use to any employee who wants to try it out and give real-time feedback so they can make improvements.

Education Week first spoke to Gazanian about the chatbot in February and then received an update about its rollout in October. Gazanian also discussed why and how his district built the chatbot and offered advice for other districts looking into ways to leverage generative AI.

This interview has been edited for brevity and clarity.

Why did the district decide to build this AI tool internally?

What we found is that developing it internally lets us do a few things: It lets us really focus on creating an AI bot that can respond in ways that are very customized to our teachers and staff. For example, it can give advice about our purpose and values, and it can give advice about our curriculum. If we need it to answer slightly differently, or if we need to add new resources, it’s not something where we have to call a company and negotiate with them.

It also allows us to pick the AI engines that are right for us.

We feel good about where we’re at in terms of privacy, because there is no middle layer. We don’t have another company who’s going through our data, then sending it off to a model. We’re processing the data ourselves.

Developing it in-house has given us a lot of flexibility and a level of customization that a lot of other places haven’t necessarily been able to achieve and that you don’t generally have when you’re working with a third party.

There probably aren’t a lot of districts with the expertise to build a tool like this internally. Could you talk about how you’ve made it work?

At the end of the day, we’re able to do it because we have a collection of—and I mean this in the more fun way—hackers in our organization, meaning we’re tech people. We love trying new things, we love fiddling with it, and we love trying to get it to work. We also have a board of education that’s very supportive. We have a real commitment to trying new things and tight feedback loops and being willing to stop doing things if they don’t work.

Internally, it’s myself and a few other members of the tech department who do more of the technology side. In the general staff in our district, we have people who are very comfortable trying new things and giving feedback, and we’ve built this mutual trust where we’re willing to explore that.

Is this AI tool a ‘walled garden’—meaning it’s trained only on vetted resources rather than all of the internet?

Currently, it’s not. We did not clip the wings of the GPT-4o model yet—which means, if you ask it something broad, it’ll pull information in [from other parts of the internet]. One of the benefits of that is when you’re writing letters, when you’re writing an analysis, having that broader corpus of information is really helpful. For example, if we’re writing an email to parents about the benefits of AI, if it was just looking for our data, it’d be like, “AI can be useful in this very small context.” Instead, it can say, “Here are five case studies about how AI is helpful in schools.”

We encourage people to read through it, make sure it makes sense, do a little bit of checking before you use it. We do put some protection there. For example, [we’ve trained it so that] if it doesn’t know the answer, it doesn’t make it up.

Los Angeles Unified bungled the rollout of its AI tool. How are you ensuring that doesn’t happen?

I’m not super knowledgeable on what happened internally. What I will say is any time you’re using a third party to do your development, there’s always a risk there. They could go out of business, there could be business disagreements, there may be a misunderstanding, there may be a change in scope. One thing that we have going for us is, because we are developing this internally, we may still run into challenges, but it’s all within our organization.

The other thing that’s really important to how we go about things is engaging our stakeholders really early. We have teachers working with us on this development internally, whereas maybe in another instance of using a third party, the third party’s never met those teachers. They’re not working with them, they’re not in the same room. In a lot of cases, in our district, we’re working with people we’ve known for five, 10, 15, 20 years. We have that trust.

And we’re comfortable playing the long game and taking enough time to make sure we do this in a way that makes sense for a smooth, continuing rollout.

What has been the feedback from staff who have used the tool?

They really like having it easily available. It also saves money. They don’t need to pay for a ChatGPT license. They really like having it trained on our own data. They like how customizable it is without having to add a bunch of extra data.

They give feedback when something’s not running so well, too. Our early version of the chatbot had no memory—meaning they could ask a question, they’d get a good answer, but if they asked a follow-up question, it wasn’t working super well. There are some things that need to be a little bit more polished if it’s going to be comparable to these other services they’re paying for.

People feel so much ownership when they can give that feedback and see it change, and that’s where it really starts to feel like a community that’s developing something together.

What advice would you give to other districts?

Have a good “build versus buy” conversation. Do you have the resources today to start this? Do you have the resources down the road to continue to support this? Have a conversation up front about whether you’re willing to commit to whatever the development and maintenance arc is over time.

Who are the people we need to have in the room for starting this? Do we have a tech person who’s excited and knowledgeable about it? Do we have someone on the curriculum side who can give us effective feedback? What is the community’s perspective?

But the other thing I’d say is reach out. I’m happy to chat. I know anyone else who’s working on these projects is happy to chat. Everyone I’ve talked to who’s working on this stuff does it because they love it, they’re intrigued by it. They recognize that the world we live in is changing and is going to continue changing because of this new technology.

Recognize early on that you’re not alone in this. Let’s build a community around it. We want to do this because it helps our teachers and our staff.

Lauraine Langreo is an Education Week staff writer, covering education technology and learning environments.
