GC Dispatches, Pt. 1 – An AI Question
My quick answer to a dinnertime inquiry
As I mentioned previously, I've been at the General Convention of the Episcopal Church since Friday. For non-Episcopalians, this is our every-three-years governance gathering. And a giant church trade show comes along for the ride, which is why I'm here.
It's been as overwhelming as usual for an event like this, and a bit more so with a four-year-old in tow. But the children's program has been wonderful, and I've enjoyed our little daddy-daughter road trip.
I'll do my best to share a couple of "on brand" reflections here in this newsletter looking at the playful side of learning, faith, and media.
Today I thought I'd say a word about AI. At dinner with some friends and colleagues on Saturday night I was asked what I think about these emerging technologies. I'll definitely revisit the topic in more depth in this newsletter, but here's a quick summary of what I said:
We need more specific language – When I served in the Center for the Ministry of Teaching at VTS from 2012 to 2016, I was often introduced as the person who did "all things digital." You've probably heard the phrase yourself, and it borders on meaningless.
That's the first thing I think of when I think of AI: It's a fair-enough umbrella term, but it covers a staggeringly wide range of applications for machine learning. What I think about using AI to improve energy efficiency or identify promising new drug treatments is way different from what I think about AI junk filling up social media feeds so that bot nets can generate a higher return for their owners. Which is a good segue to my second point ...
We need to be proactive about combating negative ecosystem effects – There's no question that generative AI is filling up the internet with junk. You've probably noticed what a wasteland social media has become. Part of the problem is that lots of people have left or cut back on usage and engagement, but another part is that it's really easy to flood the place with garbage marketing content written, drawn, spoken, etc. by various kinds of AI.
Generating credible content for "close to zero" cost is both a ridiculous pipe dream and too close to quasi-possible for companies not to try moving in that direction. As Ezra Klein and others have been screaming into the void: PLEASE PLEASE PLEASE find ways to actually pay for as much of the content you love as you can afford, because a lot of outlets are going to close down in the next few years. (You think church leaders have it bad. Say a prayer for journalists.)
And it's not just that the junk is crowding out the good stuff. Another big ecosystem danger is that search is going to become even more useless than it already is. For years, search-engine optimization has encouraged people to care more about gaming the Google algorithm for traffic than actually providing valuable content. And yes, now people can do that in even more grotesque ways with generative AI.
But the bigger problem is these AI search tools that have been popping up, including from Google. There are predictions that they could cut web traffic by double-digit percentages for many publishers who rely on search. It does feel like at some point we won't be incentivizing anyone to do the hard work of actual human writing and reporting, and these AI searches will get increasingly bad because they won't have new, reliable content to draw from.
Speaking of Human Work ... It's Good Actually? – You've perhaps seen one of the memes to the effect that it sucks that AI is making art and humans are delivering food on poverty wages. Amen to that, of course. I checked out one of the 24-7 streaming AI MTV site things, and it generates some interesting video sequences and some serviceable beats. But knowing that whoever is "at the wheel" is just dialing in parameters or whatever and not actually intending anything just makes it feel very, very sad to tune in.
I've been using AI tools for a few tasks for the last year or so and will probably identify others over time. But just like taking effortful, summative notes is a more meaningful learning activity than simply recording and re-listening to a lecture, actually writing / researching / planning / creating a playlist / whatever is almost always going to be a more cognitively beneficial and artistically satisfying experience than outsourcing the labor to AI.
Sometimes we don't need benefit or satisfaction, we just need to get things done. And I think that's fine in the light of some personal critical reflection. But let's be very clear that very few shortcuts actually serve us in the long run.
Again, I'll double back to all this in a future week with a LOT more sources. But I've been thinking more or less constantly about these questions for months, and having the occasion to share with my colleagues the other day prompted me to write some of them down.
If you're at General Convention, come say hi at the CDSP booth! I'm there about half the time, and much of the rest I'm just wandering around catching up with folks.