
DIY AI: Key Considerations (Part 1)

AI has never been more accessible, and it will impact every corner of the independent school business office.

Aug 22, 2023  |  By Cecily Garber, NBOA

From the September/October 2023 Net Assets Magazine.


This article appears online in two parts. Part 2 is "DIY AI: Business Applications."

It’s turned upper-level classrooms upside down and inside out. It promises to catalyze the next tech gold rush and is spawning new side hustles each week. According to AI Impacts’ 2022 survey of machine learning researchers, there is a 10% chance that humans’ inability to control it will lead to our species’ demise. If that doesn’t happen, it will likely reinvent the way we work, or else, given productivity gains, simply require us to do more work. That is, if it doesn’t get reined in by copyright, privacy or security regulations first.

Our eardrums have been ringing with the buzz around generative artificial intelligence since November 2022, when OpenAI released ChatGPT, short for Chat Generative Pre-Trained Transformer. It became the fastest-growing consumer app in history: by January 2023, it had reached 100 million users, who were eager to test its ability to generate largely articulate, if not always accurate, text responses to seemingly any question or prompt (by contrast, TikTok took nine months and Instagram 2.5 years to reach similar user levels). Technology that was once the sole province of highly trained data scientists and engineers has been opened up to anyone through ChatGPT and many other services: Microsoft’s Bing, which integrated OpenAI’s technology into its search chat; Google’s Bard; Anthropic’s Claude; and other providers that allow users to create text, images, videos, spreadsheets, slides and other content.

You’ve likely tried it out yourself. Early adopters like Isaac Judd, chief financial and operations officer at Gann Academy in Waltham, Massachusetts, have already found myriad uses. When we spoke in May, Judd was tapping ChatGPT to research best practices in a variety of agreements and contracts (to get a sense of what existing documents may be missing), with the full understanding that the final product would need to be vetted with human expertise, as would any agreement drafted by an employee, he pointed out. Judd was also using the tool to draft surveys and to edit campus messaging, as it can process context in a way tools like Grammarly and Word do not (yet). And he has found it serves as a helpful training tool for colleagues looking to expand their knowledge of Excel or coding. Rather than watching a video or sorting through search results, colleagues working with a generative AI tool receive precise answers to their questions and, in some cases, can simply cut and paste code into the document they are working in.
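
To make that last workflow concrete, below is a minimal sketch of the kind of snippet a colleague might receive from a generative AI tool and paste into their own environment, in this case combining monthly expense exports into a single summary. The file names and column headings are hypothetical, chosen only for illustration, and any AI-generated code should be reviewed before it touches real school data.

    # Hypothetical example: combine monthly expense exports into one summary by category.
    # File names ("expenses_2023_01.csv", etc.) and column headings are assumptions.
    import glob
    import pandas as pd

    # Read every CSV that matches the monthly naming pattern
    frames = [pd.read_csv(path) for path in sorted(glob.glob("expenses_2023_*.csv"))]
    combined = pd.concat(frames, ignore_index=True)

    # Total spending per budget category across all months, largest first
    summary = combined.groupby("category")["amount"].sum().sort_values(ascending=False)
    print(summary)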

Members of NBOA’s Editorial Advisory Committee shared in April that they were using the tool for first drafts of job descriptions and employee verification letters as well as for creating meeting summaries and supporting school store inventory management. It can be useful to non-native English speakers who may otherwise need more time to draft text in English, they said.

“Nothing is soup to nuts; it won’t finish anything for you,” Judd said. “But it’s interesting to see sometimes the different perspectives on whatever it is I’m asking it and how it can synthesize information.” We spoke just after news broke of the Colorado lawyer who submitted a brief with invented cases generated by ChatGPT, and Judd was fully aware of the tool’s limitations. But Judd has long looked for ways to be more efficient in his work, and he is excited about the road ahead. “One of my core beliefs is to question, How can we improve? How can we make things quicker, easier, more efficient and better for the end user? I could immediately see implications around a lot of different departments I work with.”

“Thus far, we've seen a range of responses to generative AI among business officers, from those who have not yet engaged, to those who use it every day in ways that I find quite innovative,” said NBOA President and CEO Jeffrey Shields. “Regardless of how you feel about AI, its myriad uses are still quickly emerging, and it’s imperative to learn about it, so that you as a leader are aware of its potential benefits and pitfalls.”

Key Considerations Today

Much of the public discourse around generative AI thus far has centered on the classroom and how teachers and students negotiate the use of the new tools in learning. But there are clear implications for the business office as well. Ashley Cross, Ed.D., ATLIS’s senior director of education and content, is ardent about the technology’s transformative potential but also urges caution. “This will very much transform all of our daily lives,” she said, “but then the question is, when should we use it?” Here are some of the key considerations she laid out:

  • Privacy: Schools hold highly sensitive information, which should never be fed into open-access tools like ChatGPT or Bing, because the data goes back into the system that generates answers or new content. While “shiny promotional videos” from companies like Microsoft show how lists can be transformed at the press of a button, business officers must ensure those lists don’t contain data that should stay private to the school, Cross said. Likewise, the tools can help draft sensitive emails, about a late payment for example, but never include the real family’s name; use a placeholder instead (see the sketch following this list).
  • Cybersecurity: Generative AI cuts both ways in this regard. Network monitoring can be more highly automated, but new tech can make it quicker and easier to “exploit specific vulnerabilities, within your website, for example, crawling and writing code to exploit the exact vulnerability that you have,” Cross said.
  • Fraud: Generative AI has the power to unleash deepfakes that could mimic the head of school’s voice, for example, to ask for payments and transfers of money. With just a three-second voice sample, the technology can now recreate that voice credibly, Cross explained. “Social engineering is going to be bigger than ever.” Accordingly, it will be critical for schools to develop strong fraud protection protocols.
  • Bias: Any tool that uses an algorithm may have bias baked into it (see sidebar at right for more on this topic). So when AI helps screen candidates or is used to develop a performance review, users should be wary of the biases that may be influencing decisions. “Amazon, for example, used an AI tool to vet their candidates, and it pretty much wouldn’t hire women, and so discontinued the product,” Cross said. This spring the EEOC issued guidance clarifying that employers are responsible for ensuring that the tools they use in HR matters are not discriminatory. Even if the vendor claims a tool is free of bias, the employer is liable if it is found to have perpetuated discrimination.
  • Insurance: Due to all of the above, insurance policies may change to account for these new risks and ways to mitigate them. (For more on the impact of AI on insurance broadly, see section on insurance on page 23.)
  • Classroom implications: AI “really can provide a personalized learning experience,” said Cross. She explained how the University of Georgia implemented an AI chatbot to check in with students and send reminders, which improved the grades of first-year college students by 11 points on a 100-point scale. But she cautioned that the technology needs to be taught to students so they understand how to use it properly. “Just because [students] are comfortable playing around with something does not mean that they have any sense of guidelines or boundaries for how to properly utilize a tool. It’s our responsibility to teach them even if we are learning alongside them.”
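
To illustrate the placeholder practice Cross describes, here is a minimal sketch, assuming a simple hypothetical helper that swaps real details for generic tokens before a prompt ever leaves the school’s systems. The family name, amount and prompt wording are invented for the example; this is a picture of the habit, not a vetted redaction tool.

    # Minimal sketch: substitute placeholders for sensitive details before sending
    # a drafting prompt to any external generative AI service.
    # The family name, amount and wording below are hypothetical.
    def redact(prompt: str, replacements: dict[str, str]) -> str:
        """Swap each real value for a generic placeholder token."""
        for real_value, placeholder in replacements.items():
            prompt = prompt.replace(real_value, placeholder)
        return prompt

    draft_request = (
        "Draft a polite reminder to the Smith family that their May tuition "
        "payment of $2,450 is past due."
    )

    safe_request = redact(
        draft_request,
        {"Smith": "[FAMILY NAME]", "$2,450": "[AMOUNT]"},
    )

    # The placeholder version is what goes into the AI tool; the real details
    # are filled back in locally after a person reviews the draft.
    print(safe_request)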

Cross also underscored that the technology has had a huge impact on teachers. “The headlines were saying, ‘English Class is Dead.’ That happened overnight, before English teachers may have even heard of it. A lot of emotions come with that.” School leaders should be aware of these concerns and support faculty through challenging times.

Generative Thinking About Generative AI

All of the above is critical to know now, but school leaders may also be wondering what the larger implications of this technology will be for the business office down the line.

“In futurist thinking, we don’t like to ask, What are the biggest impacts [of a certain development] for us as an organization? But rather, What are the biggest questions [we should be asking]?” said Greg Bamford, senior partner and co-founder at Leadership + Design and a former independent school head, administrator and teacher. After all, “you can’t predict the future with a high level of certainty.” In the face of potential change, Bamford urges leadership teams to spend more time imagining possible futures to develop more “cognitive flexibility as a team.” When teams are thinking about a range of outcomes, they “are less surprised when something happens. They’re able to react faster and in more useful ways,” he explained.

Take the potential impact of generative AI on the student program, Bamford said. That may seem distant from the direct work of the business office, but it could significantly change how resources are used, including physical space, technology and faculty. The ideal profile of a teacher may change. The staffing model, onboarding processes, performance management and professional development may change as well. In fact, the Spokane school district in Washington state is already using AI to give teachers feedback more regularly than lead teachers can provide it, as outlined in an EducationWeek article this spring.

Furthermore, developments in AI could potentially alter the independent school business model, Bamford said. “I don’t think generative AI is yet at the point of disrupting the cost structure for schools,” he said, but it is time to start asking bigger questions: “What’s the work currently done by humans that no longer needs to be done by humans? What is the work done by humans that only humans can do? How do you use generative AI to create more richly human environments?”

While current technology likely is not strong enough to make roles redundant now, that could be the case five to 10 years down the line. “Having an awareness of where needs might be going allows you to take advantage of a transition when it happens in a natural way,” that is, through retirement or an employee moving on, rather than laying people off, he said. The work of the registrar, for example, could change with more highly developed AI. And relatively small impacts in costs per year, like a 1% decrease in expenses, can have strong ramifications down the line, he noted.
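
As a purely hypothetical illustration of that point, assume a school with a $20 million operating budget (a figure chosen for the example, not drawn from the article). A 1% reduction in expenses works out to:

    \[
      0.01 \times \$20{,}000{,}000 = \$200{,}000 \ \text{per year},
      \qquad
      \$200{,}000 \times 10 \ \text{years} = \$2{,}000{,}000
    \]

two million dollars over a decade, before accounting for any growth in the underlying budget.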

If your team is having trouble getting started with these kinds of questions, a tool like ChatGPT could itself help spark divergent thinking. The cover story of the July/August issue of Harvard Business Review, “Gen AI and the New Age of Human Creativity,” addressed this very topic in terms of idea generation in business. The authors argue generative AI can be used to augment and enhance divergent thinking by challenging expertise bias, generating novel ideas, supporting idea refinement and more.

Read Part 2 of this article: "DIY AI: Business Applications."


Author

Cecily Garber

Cecily Garber, Ph.D.

Associate Vice President, Communications and Member Relations

NBOA

Arlington, VA

Cecily Garber is the editor of NBOA's Net Assets magazine and directs NBOA's publication efforts, which include books, reports and industry guidance. She also oversees the communications and member relations team, which is responsible for all membership, marketing and communications efforts.

