Generative AI Is All the Rage. Handle With Care.

Generative AI is all the rage, and church leaders may want to dive right in. But be cautious: there are a lot of unknowns.

Editors’ Note: We updated this content on July 25, 2023, with additional information, including comments from Jonathan Smith, president of technology consulting firm MBS, Inc., and director of technology for Faith Ministries in Indiana.

A year ago, few knew the term “generative AI” (generative artificial intelligence).

Fewer could define it.

Today, it’s leading the conversation. OpenAI’s fast-spreading ChatGPT and DALL-E, Google’s Bard, Meta’s LLaMA, and Microsoft’s ChatGPT-powered Bing have quickly pushed these tools into the mainstream.

Exponential growth

ChatGPT surged past 100 million users less than two months after its November 2022 debut. Facebook, by contrast, took four-and-a-half years to surpass the same mark.

Even with preliminary signs of slowing growth, many agree these chatbots are here to stay. As one technologist observed in Harvard Business Review, “We are at the beginning of another technological revolution.”

With such a revolution underway, church leaders should keep some legal and risk management considerations in mind, including intellectual property, misinformation, privacy, defamation, cybersecurity—and even hiring.

Another tool—but not just any tool

Technology in ministry has evolved a lot over the past 25 years. Some leaders recall debating how their churches should build their websites.

From there, text-based messaging and church management software appeared.

Soon after, social media sites and online giving tools popped into focus.


Then came smartphone apps to handle anything from communications to donations to administrative tasks.

Generative AI is different not only in what it does, but also in how quickly it evolves.

“Generative AI systems fall under the broad category of machine learning,” notes global consulting giant McKinsey & Company. McKinsey then asked ChatGPT to describe itself. The chatbot replied: “This nifty form of machine learning allows computers to generate all sorts of new and exciting content, from music and art to entire virtual worlds.”

Learning as it goes

In other words, chatbots learn from the questions and information users submit through a simple chat box, from their own online searches, and from user feedback. They then create new and better content for future queries.

The process is a self-perpetuating, iterative loop.
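
For readers curious about the mechanics, here is a minimal sketch, assuming OpenAI’s Python library as it existed in mid-2023, of how an application typically passes each new prompt, together with the running conversation, back to the model. It illustrates why everything typed into a chat box ends up with the provider; it is not a reproduction of how these companies actually train their systems, and the model name and API key shown are placeholders.

    # Minimal sketch: each call sends the full conversation history, so every
    # question, correction, and piece of pasted text becomes data the provider
    # receives (and, depending on settings, may retain to improve future models).
    import openai  # OpenAI's Python library, mid-2023 (0.27.x) interface

    openai.api_key = "YOUR_API_KEY"  # placeholder only; never hard-code real keys

    conversation = [
        {"role": "system", "content": "You are a helpful assistant."}
    ]

    def ask(prompt):
        """Append the new prompt, send the whole history, and keep the reply."""
        conversation.append({"role": "user", "content": prompt})
        response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            messages=conversation,
        )
        reply = response["choices"][0]["message"]["content"]
        conversation.append({"role": "assistant", "content": reply})
        return reply

    print(ask("Summarize the Beatitudes in two sentences."))
    print(ask("Now shorten that to one sentence."))  # builds on the prior turn

Because the entire history travels with every request, anything sensitive that gets pasted in, such as a congregant’s prayer request, accumulates on the provider’s servers, which is one reason the privacy cautions later in this article matter.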

For instance, ChatGPT 3.5, launched in November 2022, was already significantly inferior when ChatGPT 4 was released just four months later.

To illustrate, ChatGPT 3.5 scored in the 10th percentile on the Uniform Bar Exam, which many states use to license attorneys. ChatGPT 4 scored in the 80th percentile.

Such rapid change has sparked much wonder about what the technology can do for storytelling, art, music, and more.

It also has generated much concern, ranging from cybersecurity threats (such as more sophisticated phishing and malware) to cheating in schools to existential threats to humanity.


Church leaders should start thinking now about how these content-creating chatbots will shape their ministries, whether driven through their own initiatives or thrust upon them by outside forces.

Here are some early considerations to note:

Intellectual property

Churches and pastors create lots of original content. Sermons. Children’s plays. Worship music. Website fodder. Social media posts.

Looking to generative AI chatbots for help requires extreme caution, though. Chatbots aren’t necessarily pulling together fully original creations. As they scour sources provided by a user, as well as readily available information online, they can easily grab material verbatim.


That means chatbots most likely are grabbing and using pieces of works owned by other people or companies. Any resulting uses very likely violate copyright law—and courts have found that can be true even if duplications involve only a few notes or phrases.

Leaders must recognize the potential perils of instructing a chatbot to create a sermon about, say, the Beatitudes. Or to compose a worship song based on the influences of current chart-toppers. Or to generate a script for a Christmas pageant.

Not a ‘super search engine’

In a white paper, OpenAI openly discusses the way ChatGPT “hallucinates” as it works—another way of saying the chatbot sometimes creates fictitious details to support the task it was given.

One New York lawyer learned this the hard way. He asked ChatGPT to write a brief on behalf of a client suing an airline and submitted the document to the court. When the airline’s lawyers couldn’t locate any of the cases it cited, the court asked for more information. It turned out the brief cited fabricated cases and statutes to support the client’s position.

The attorney, facing sanctions, said he believed ChatGPT operated like “a super search engine.” It doesn’t. Leaders again must carefully vet what a chatbot produces, no matter the task.

Subtle inaccuracies are as much a threat as outright fabrications.

Here’s an example:

We asked ChatGPT 4 to write about NFL quarterback John Elway’s greatest game. The chatbot quickly answered Super Bowl XXXII, the Hall of Famer’s first championship win. It supported its response with his game statistics, but erroneously listed the ones he earned a year later during his team’s Super Bowl XXXIII victory.

Defamation

Generative AI also poses potential defamation concerns.

In June 2023, a Georgia radio host sued OpenAI, ChatGPT’s maker. A journalist-generated query produced a ChatGPT response that included a legal complaint about the host and allegations that he embezzled from a gun-rights group. The legal complaint, though, was fake, another example of a chatbot providing unpredictable, if not false, information.

This situation also reveals other ways churches and pastors may encounter less-than-ideal situations with the technology, says Jonathan Smith, president of technology consulting firm MBS, Inc., and director of technology for Faith Ministries in Indiana.

A generative AI query using your church’s name, or your pastor’s name, will draw from myriad sources, including negative posts, comments, and social media content from across the web.

It also could do damage in unexpected ways. “Generative AI has no intuition, no understanding,” Smith says. “It will draw the conclusion your church or your pastor is bad.”

Privacy

OpenAI discovered a bug with ChatGPT that exposed user chat histories, along with payment and contact information for some of OpenAI’s premium subscribers. The problem was fixed. But when users now sign up, they see an onscreen message warning them against sharing any sensitive or personal information.

For churches, this again provides an important reminder about getting consent from congregants before publicly sharing prayer requests.

A well-meaning pastor or staff member, crafting the next church website update or e-newsletter, might contemplate pouring the text of those requests into a chatbot to create the message. Doing so places potentially sensitive information about individuals into the chatbot. And the public disclosure of private facts can form the basis of an invasion-of-privacy lawsuit.

Lack of user support and feedback

When a chatbot generates faulty or false information, there isn’t an immediate remedy available.

A user can submit feedback indicating a response contains faulty or false information. But, as Smith points out, other platforms like Facebook, Twitter, and Instagram struggle to keep up with user feedback, and ChatGPT and its peers likely will struggle, too.

Leaders should still note this added concern arising from generative AI’s unpredictability, and exercise extreme caution before using any AI-generated responses containing information about other church and ministry leaders.

Cybersecurity

The powerful learning fostered by generative AI offers possibilities for good, including medical imaging and weather forecasting. There are also possibilities for more sophisticated crime. One cybersecurity company executive told the ABA Journal that “you’re going to see phishing emails that are so believable you don’t know you’re talking with a machine.”

These schemes use an email to impersonate someone with authority and trick the recipient into clicking a malicious link or transferring funds to an unauthorized bank account. A misstep can prove costly. Education, training, and best practices can help pastors and staff thwart these attempts.

On the positive side, new generative AI tools are available to determine whether a message originated from a person or a chatbot, including one from OpenAI (note, however, that it requires a minimum of 1,000 characters to analyze).  

Job applications, references, and schoolwork

Does your church ask job applicants for responses to short-answer questions? What about recommendations from past employers or professional references?

Does your church run a school and regularly grade schoolwork?

Note the developing tools for distinguishing original content from computer-generated content; such detection is becoming an industry unto itself.

Matthew Branaugh is an attorney and content editor for Christianity Today's Church Law & Tax.

This content is designed to provide accurate and authoritative information in regard to the subject matter covered. It is sold with the understanding that the publisher is not engaged in rendering legal, accounting, or other professional service. If legal advice or other expert assistance is required, the services of a competent professional person should be sought. "From a Declaration of Principles jointly adopted by a Committee of the American Bar Association and a Committee of Publishers and Associations." Due to the nature of the U.S. legal system, laws and regulations constantly change. The editors encourage readers to carefully search the site for all content related to the topic of interest and consult qualified local counsel to verify the status of specific statutes, laws, regulations, and precedential court holdings.
