Grammar Girl Quick and Dirty Tips for Better Writing

Struggling with AI: Job loss, energy use, and more, with Christopher Penn

Episode Summary

1040. The downsides of AI bother me a lot — job loss, energy use, and the content tsunami. But I also think it's critically important to understand what this technology can do and how it's likely to change the way we work and live. In this Grammarpalooza bonus segment, I talked with Christopher Penn about how he thinks about these problems.

Episode Notes

1040. The downsides of AI bother me a lot — job loss, energy use, and the content tsunami. But I also think it's critically important to understand what this technology can do and how it's likely to change the way we work and live. In this Grammarpalooza bonus segment, I talked with Christopher Penn about how he thinks about these problems.

🔗 Join Grammarpalooza. Get ad-free and bonus episodes at Apple Podcasts or Subtext. Learn more about the difference.

🔗 Share your familect recording in a WhatsApp chat.

🔗 Watch my LinkedIn Learning writing courses.

🔗 Subscribe to the newsletter.

🔗 Take our advertising survey.

🔗 Get the edited transcript.

🔗 Get Grammar Girl books.

| HOST: Mignon Fogarty

| VOICEMAIL: 833-214-GIRL (833-214-4475).

| Grammar Girl is part of the Quick and Dirty Tips podcast network.

| Theme music by Catherine Rannus.

| Grammar Girl Social Media Links: YouTube. TikTok. Facebook. Threads. Instagram. LinkedIn. Mastodon. Bluesky.

https://grammar-girl.simplecast.com/episodes/penn-bonus/transcript

Episode Transcription

MIGNON: Grammar Girl here. I'm Mignon Fogarty, and I'm back with another bonus episode that our wonderful Grammarpaloozians got a few months ago. This one is from my October interview about writing and AI with Christopher Penn.

In the original episode, we talked about common misconceptions around AI, and today, you'll hear us talk about concerns about unemployment from AI, ways to overcome energy concerns, and how to write better prompts. If you didn't catch the original interview, you can find it in your feed or linked in the show notes. We do these kinds of extras for Grammarpalooza subscribers all the time, so if you're interested in becoming a subscriber to get the bonus content, and most importantly to support the show, you can find out how in the show notes or by going to quickanddirtytips.com/bonus. And now, enjoy my bonus chat with Christopher Penn.

MIGNON: So now we're here. We're in our, you know, our smaller group conversation for our wonderful supporters. Thank you so much.

CHRISTOPHER: All the new people are gone. Ha ha.

MIGNON: You know, it's funny because when I hear people talk about the risks of AI, often they're talking about sort of the Skynet thing you were talking about: oh, it's going to decide humans are irrelevant and kill us all. You hear that conversation a lot, but that's not what I worry about. I worry about the effects on the economy and job loss. I know editors who've lost work. I know writers who feel like they're losing work. What do you worry about when you think about AI?

CHRISTOPHER: I worry about structural unemployment. Structural unemployment means that the overall system changes in such a way that those jobs are never coming back. And I'll give you a really good example. A human being can pick a bushel of corn in about 10 hours. It takes a long time. It's painful work. It's not particularly fun. If that single human being is driving a John Deere X9 1100, that same human being can pick 23,000 bushels of corn in the same 10 hours because the machine is incredibly efficient. And as a result, that one human being's output is 23,000 times greater. What does that mean? You don't need a thousand people picking in the field anymore. You need one dude (and it's almost always a dude) who's driving around, listening to podcasts as he drives this house-sized combine through a field. That means you get more food. You get presumably at least somewhat better food; it's more standardized. But it means that the 999 other people who would have been working in that field are no longer employed there. You need some people to maintain the machine and things like that, but for the most part, that job in agriculture is radically different now.

AI is to knowledge work what machinery is to agriculture. Yes, there are going to be job losses, and not small ones; significant ones. And the more straightforward a task is, the more likely it is that a machine will be doing it. Proofreading, for example. Not developmental editing, but just proofreading, like, "Hey, there are some grammar issues here." Machines have been able to do that for a while, and they can do it really easily.

Now, writing first drafts, writing outlines, ideation, brainstorming: the machines are very capable of all of that. In the marketing world, it's estimated that websites that publish content will lose 20 to 40 percent of their traffic because generative AI will simply consume it. If you can ask Perplexity for an answer and you don't have to click anything, well, why would you? You don't need, as a consumer, to give traffic to a marketer.

MIGNON: Yeah. We're definitely seeing that.

CHRISTOPHER: Yeah. And so the thing that individuals have to be thinking about is: what is the human value that you provide that is above and beyond just the skill itself? Because the machines are skill levelers.

I'll give you an example. There's a tool called Suno, song composition software. It's good. It's not great. Not amazing. It ain't gonna win a Grammy. But it's better than me, someone completely incompetent at all forms of music, right? I can barely sing (and you don't want to hear it), I can play no instruments, I can read no music, and yet I can give directions to a machine and produce an okay song.

That means if I was going to make that song, I would not hire a composer, I would not hire a band, I would not hire a recording studio and stuff like that. I would just have the machine do it. Now, would that song have been made? No, because I would not have hired them to make a silly song of some kind.

But it's a skill leveler. I don't need musical skill to make music. I don't need writing skill to make writing. I don't need editing skill to edit when I have machines that can help do those tasks, as long as I have the proficiency to operate the machine. And so for people who have those skills, what is the human value on top of your skills that separates you, that differentiates you, that is a unique point of view, that is a value add that a machine can't do?

MIGNON: Right. Because people don't always make the most rational decisions when they're hiring people or spending money. We don't always look for the most efficient writer we can possibly hire. A lot of times, people are thinking, "I know this writer, I like them, and they do good work for me."

I think nurturing the relationships you have with your clients, if you're a freelancer, for example, is going to be critically important for those people to stay employed as writers and editors.

CHRISTOPHER: Absolutely. And to the extent that you can (and this is going to rub a lot of people the wrong way, and I'm sorry for it), you should be figuring out how to automate as much of your skill as possible, to get the machines to do what you do so that you can scale what you can do, so that what makes Mignon Fogarty a great writer can be bottled and reproduced inside the machines, so that Mignon Fogarty can write five business books a year or 15 business books a year.

My next book, I've just finished doing the human editing pass on it. It's an 80,000-word business book called "The Intelligence Revolution." It took me two years to create the raw materials, because it's composed of all my podcasts and all my newsletters, which are all human-led, and approximately six hours to write the first draft, because I said to Google Gemini, "We're going to write a book. Here's the outline for the book, which is basically my keynote talk. And here's two years of material. Your job is to steal from me as much as possible, because you can't really steal from yourself, and assemble a business book that sounds exactly like me, composed of all my material."

And in the same way we were talking earlier about, you know, how to write a fiction story, I followed the exact same process for a nonfiction book.

And yeah, the first draft came out to 110,000 words, and it was decent. It really sounded very much like me. I had to do a decent amount of editing because there are some quirks to AI that you get to see after a while, and you can just sort of edit them out. So the editing process took about a week, and now I'm about ready to go to press on it.

That was trained on me with my stuff, but I was able to create a book out of it. And I could probably create three or four more books from it that are uniquely my style. And this is what writers and editors and everyone have to think about: how can we take who you are as a person, bottle it, and have a machine do as much of it as possible?
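To make the workflow Chris describes a little more concrete, here is a minimal sketch of assembling a first draft from your own material using the google-generativeai Python client. The file names and prompt wording are illustrative assumptions; the episode doesn't describe his actual prompts or tooling beyond "I said to Google Gemini."

```python
# A rough sketch of the "assemble a book from my own material" workflow
# described in the interview. Assumes the google-generativeai package and
# an API key; file names and prompt wording are illustrative, not his setup.
import pathlib
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # supply your own key

# Load the human-created raw material: an outline plus years of newsletters.
outline = pathlib.Path("keynote_outline.md").read_text(encoding="utf-8")
corpus = "\n\n".join(
    p.read_text(encoding="utf-8")
    for p in sorted(pathlib.Path("newsletters").glob("*.md"))
)

# Gemini 1.5 Pro's long context window is what makes "here's two years of
# material" feasible in a single request.
model = genai.GenerativeModel("gemini-1.5-pro")

prompt = f"""We're going to write a business book.
Here is the outline, based on my keynote talk:
{outline}

Here are two years of my newsletters and podcast transcripts. Reuse my own
wording, structure, and examples as much as possible, and draft a chapter
that sounds exactly like me:
{corpus}
"""

response = model.generate_content(prompt)
pathlib.Path("draft_chapter.md").write_text(response.text, encoding="utf-8")
```

In practice a full draft would have to be generated chapter by chapter, since a single response is capped well below 110,000 words; the point is only that the human-made outline and corpus, not the prompt itself, carry the substance.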

MIGNON: Okay. I have like three different thoughts. I'm frantically scribbling notes here. So, first: there's already a deluge of content. I hear people talking about how Amazon has already been flooded with low-quality AI books. And if you can write seven books a year, and I can write seven books a year, who's going to read all these books?

Like, how do you conceptualize the tsunami of content that's going to be coming at us?

CHRISTOPHER: Well, the thing that matters is brand: your personal brand, your network, your audience who is loyal to you. Those are the people who buy from you. Because even before AI, I was getting these emails saying, "Hey, you can buy, for $149, a perpetual license to download 500,000 eBooks on any topic. And you are legally permitted to rebrand them, rename them, stick your name on them as an author, and upload them to Amazon." It's like, you know, "The Money Machine Miracle" or whatever. It's always crappy self-help books, keto cookbooks, you name it. And think about that.

If you wanted to just be a published author, you could spend $149, scratch off the name on the cover, upload it to Amazon, and boom, you're an author. And you've just added one more piece of text that's probably identical to the books from the hundreds of other people who bought that exact same package and put their names on the exact same books.

And at that point, it's kind of like vodka, right? Every vodka theoretically should be the same two things in the bottle: 40 percent ethanol and water. That's it. The only thing that makes vodka sell is marketing — how the bottle looks and how good your marketing program is.

So what's in the bottle doesn't actually matter. With books and content, it's kind of the same thing. Yes, there is objectively better quality content and lower quality content, but we've all had that experience of reading a book and going, "How is this author so popular? This is terrible. Why did I pay for this? Can I get a refund?"

And you realize it's because the author is a better marketer than they are a writer. And that's what every author, every writer, and every content producer has to confront: can you build a community? Can you build your group, your conclave of people who are loyal to you and will buy what you produce, even when there's a market full of alternatives?

MIGNON: Okay. Now I have like eight more thoughts. So first of all, what about copyright, right? What I was going to say is, certainly there will be people who build their brand on the fact that they don't use AI, that this is a human-written piece of work and you're getting their authentic words, right?

And, you know, I mean, in one of my newsletters at the bottom, I put "written by a human" because I want people to know it's written by a human, but also I do it for copyright reasons, which is something you taught me. So, the book that you wrote with AI, what are the copyright issues around that?

CHRISTOPHER: So this is interesting, and I will preface this by saying, I am not a lawyer; I cannot give legal advice. Please contact a qualified attorney in your jurisdiction for advice specific to your situation. You can tell I say that a lot. There's a difference between a generative and a derivative work. A generative work is one in which you say, "Hey, write me a blog post about blah," right?

Okay. Whatever the thing is, there is no original work to point to. If you can say, "Here's the human work that this originated in," and you can very clearly see that what the AI summarized is derivative of that, then like any other derivative work in court, the derivative work inherits the original copyright.

So if I took Mignon Fogarty's latest book and had generative AI rewrite it in my tone of voice, it's still your book, right? No matter how many times I use AI on it, as long as it is recognizable as the original work, same structure, same major points, yeah, I'm going to lose that lawsuit. The same is true for my book.

So all the stuff that's in my book is structured identical to my keynote. I did that on purpose. So you could say like, "Yeah, here's my keynote that I've been doing for two years on the topic," and the book is weirdly, structurally exactly the same. And because it's fueled from my newsletters that I wrote myself and from my YouTube channel and all this stuff, it's my work.

And so every piece of the book I can point to — here's where this part is derived from. And therefore, if it's challenged in court, I have the receipts to say, "Here's why this is a derivative work and not a generative work."

If you were to not do that and just have the machine create it from whole cloth from a series of prompts, then the prevailing law on purely AI-generated works would be in effect, which is that it has no copyright.

MIGNON: So another thing I've been reading a lot about is climate: AI and climate change and water use. You know, there are huge concerns that the rate at which they're building data centers is ramping up problems with climate change, and also that the amount of water used to cool the data centers is terrible.

But there are already data centers, and it's already a problem. Tech already has a problem with energy use. I've been trying really hard to get my head around this, and I feel like where I'm kind of coming down is that maybe AI is like gas-driven cars: we know they're bad for the environment, but they give such benefit that we use them anyway, that we decide it's worth it.

[OK, I have to jump in here in editing because I get mad every time I hear myself say this! I don't personally think it's worth it. I have an electric car! But I think this is how a lot of society feels. And now, back to my poorly phrased question in the original interview.]

Because, I mean, that's what I see happening. Google is putting this in every search; the individual choices I make suddenly feel like they don't matter because suddenly Google is doing, I don't know, 20 million AI searches a day, and what does my one more matter?

But I'm not sure this is the right way to think about it. How do you think about it? Because, you know, I use AI intermittently. I get the impression you're using AI all the time. So tell me how you scale this.

CHRISTOPHER: So here's the thing: it depends on what models you are using. The big foundation models, like Google Gemini 1.5 Pro or ChatGPT with the GPT-4 Omni model, yeah, those require big data centers. You know, Microsoft is saying, "Hey, we need to build some more nuclear reactors just for our stuff."

Yeah, those things consume a lot of power. They do. There's no denying that reality. However, there's more than one AI model. In fact, there are close to 900,000 different models, from models that are so small they can run on a smartphone all the way to something that needs a nuclear reactor. For a lot of tasks, depending on the consumer tool you're using, there is invisibly what's called a router that reads the query, the prompt you put in, and then it routes it to the appropriate model.

So just in the Google ecosystem, there are Gemma 2 9B, Gemma 2 27B, Gemini 1.5 Flash, and Gemini 1.5 Pro, in that order. They get more and more energy-intensive. If the router sees that you just asked it to summarize a transcript, it's going to send that to Gemma 2 9B, because that is a super lightweight, super fast, super cheap model that will accomplish what the user wants.

So it will handle that, and it will route it to the lowest-cost option, because one of the things that is true about energy and water consumption is that it costs money. This is where capitalism kind of comes in handy: these companies are going to do things as cheaply as they possibly can, so they're not going to put a simple request into the most powerful model, which is going to consume a tremendous amount of energy.

That's a waste for everybody. It costs them money. It costs you money. It's going to send it to the cheapest possible model. And we see that, for example, with Apple Intelligence and the way they're outlining how AI is going to work on iPhones. At WWDC, they said, "You know, we're going to try and do as much on this device in your hand as possible and only route to the cloud when we absolutely, positively have to," because this thing is powered from, you know, whatever you plug it into at night.
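For the curious, here is a toy sketch of the routing idea Chris describes: a classifier guesses how hard a request is and picks the cheapest model tier that can handle it. The heuristic and the tier assignments are illustrative only; real routers inside consumer tools are proprietary and far more sophisticated.

```python
# Toy illustration of the "router" idea: send cheap, simple requests to a
# small model and reserve the big, energy-hungry model for hard ones.
# The heuristic and the tiers below are illustrative, not how any
# particular vendor actually routes traffic.

MODEL_TIERS = [
    ("gemma-2-9b", 0),       # lightweight: summaries, short rewrites
    ("gemma-2-27b", 1),      # mid-weight: longer or more nuanced tasks
    ("gemini-1.5-flash", 2), # fast general-purpose model
    ("gemini-1.5-pro", 3),   # heaviest, most expensive tier
]

def estimate_complexity(prompt: str) -> int:
    """Crude stand-in for a real router's classifier."""
    score = 0
    if len(prompt) > 4000:
        score += 1
    if any(w in prompt.lower() for w in ("analyze", "reason", "plan", "prove")):
        score += 1
    if "step by step" in prompt.lower():
        score += 1
    return min(score, 3)

def route(prompt: str) -> str:
    """Pick the cheapest model tier that matches the estimated complexity."""
    needed = estimate_complexity(prompt)
    for model, tier in MODEL_TIERS:
        if tier >= needed:
            return model
    return MODEL_TIERS[-1][0]

print(route("Summarize this transcript: ..."))  # -> gemma-2-9b
```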

And so as the technologies improve, we're seeing more usage across different levels of power. The models that I use on a day-to-day basis run on my laptop. There's no nuclear reactor at my house, and they don't consume any water whatsoever. It's just my laptop and the fan that's in it.

But again, as these things evolve, we are seeing differentials. We are seeing, for example, people talking about the data center Microsoft is trying to build for AI training off the coast of Ireland, because, you know, why take fresh water? We can just sink the data center deep down in the ocean, where the ocean will cool it.

Another company, I think it's Blue Origin, is saying we're going to build a model training center, which is an incredibly electricity- and heat-intensive process, and we're going to put it in low-earth orbit, because nothing handles heat like space. Right? You know, that's a perfect use because you can power it with the sun without worrying about clouds.

You can just let the heat radiate off into space, and there's no consequence for it. So the industry has looked...

MIGNON: The cost of getting it up there, I mean...

CHRISTOPHER: Yes, exactly. So the industry as a whole is looking to reduce costs. And obviously, when you're talking about the environment, anything that consumes energy, water, and resources is a cost.

So there is a built-in incentive for all of us to do things as efficiently as possible. And as the technology matures, that will get more and more of a focus because again, companies want to cut costs and make money.

MIGNON: Okay, this is great. So this troubles me a lot. I drive an electric car. I power it from solar on my roof. So you're telling me that, like, I can run a model that's really good now on my laptop that isn't using any electricity other than the solar from my roof?

CHRISTOPHER: Exactly. There's a model called Mistral Nemo that I run. It's the 12-billion-parameter model. It's a phenomenal fiction writer. Like, it's really good, even though that's not what it's designed to do. It's actually designed to write code, but it's, for whatever reason, surprisingly fluent at writing decent fiction.
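For anyone who wants to try a local model themselves, here is a minimal sketch using the Ollama Python client and its mistral-nemo model tag. That runtime and the prompts are assumptions for illustration; the episode doesn't say which tool Chris actually uses to run the model on his laptop.

```python
# Sketch of running a ~12B-parameter model entirely on a laptop via Ollama.
# Assumes Ollama is installed and you've run `ollama pull mistral-nemo`;
# the prompts are illustrative.
import ollama

response = ollama.chat(
    model="mistral-nemo",
    messages=[
        {"role": "system", "content": "You are a fiction writer with a wry, conversational voice."},
        {"role": "user", "content": "Write the opening paragraph of a short story set in a grammar-obsessed small town."},
    ],
)
print(response["message"]["content"])
```

Everything here runs on the local machine, which is the point Chris is making about power and water: the only resources consumed are whatever the laptop itself draws.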

MIGNON: So to finish up, you had this amazing blog post recently about how to make AI sound like you. And I think this is something I see people misunderstand a lot: they play with AI, and they get out writing that they think sounds like AI, but that really is just a matter of proper prompting. A couple of weeks ago, I took a quiz, like, "Can you recognize whether a review of a hotel was written by AI or a human?" And I did terribly. I got 9 out of 15. Basically, I was barely better than chance at identifying AI-written versus human-written reviews. So, if you could just talk about this post you did, which was just so great, about how to adjust the tone of AI so it writes like you.

CHRISTOPHER: The issue with writing is that this is something I think the average writer would struggle with: what is your writing style? That wasn't rhetorical. What is your writing style?

MIGNON: Oh, um, fun and friendly, objective, crisp maybe. I hope, I hope it's all those things.

CHRISTOPHER: So the reason why that's a bit of a trick question, and I apologize, is that writing style doesn't exist. Writing style is an umbrella term for things like clarity, conciseness, audience awareness, purpose, voice, tone, diction, formality, specificity, imagery, paragraph structure, and transitions. There are so many aspects to writing style that we don't think about when we write.

We just kind of do it and assume that, you know, it all comes out in the wash. Machines don't know that. Machines cannot understand that. But what you can get a machine to do is say: here's a whole bunch of my writing, like a couple of hundred thousand words, and here are the 20 to 25 components of writing style.

Create a detailed analysis of my writing style, right? And once the machine does that, it can say, like, "Hey, you write like this," covering diction, the variety of words that you use, and so on. And then you can create, again, very long prompts that say, "Here's the task. Here's my writing style that you analyzed, as four pages of analysis. Here's three pages of examples."

Because a writing style analysis never fully replicates the writing style on its own, you would then say, "Now, with these examples of how I write, with the guidelines about my writing style, with all that stuff, start to write like me," and you will get a much, much better simulacrum of who you are from that set of techniques. And, you know, like we were showing earlier with the sensitivity reader, you want to have it build a scorecard to score your specific writing style, so that you can say, "Yeah, you didn't do it right."

You rescore and try again, and have the model try again. So that's how you get AI to write like you: first, we're going to define what writing style is. Again, remember: smartest intern in the world, but still an intern. Then we're going to analyze my writing, then we're going to build a scorecard so that you know whether you're doing my writing well, then I'll give you some examples of my writing, and now we can write like me.
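Here is a minimal sketch of that define-analyze-score-rewrite loop, again using a local model through Ollama. The style components come from Chris's list above; the file name, model tag, and prompt wording are illustrative stand-ins rather than his actual prompts.

```python
# Sketch of the "define, analyze, score, rewrite" loop described above.
# Uses a local model via Ollama; prompts and file names are illustrative.
import pathlib
import ollama

STYLE_COMPONENTS = (
    "clarity, conciseness, audience awareness, purpose, voice, tone, diction, "
    "formality, specificity, imagery, paragraph structure, transitions"
)

def ask(prompt: str) -> str:
    reply = ollama.chat(model="mistral-nemo", messages=[{"role": "user", "content": prompt}])
    return reply["message"]["content"]

writing_samples = pathlib.Path("my_writing.md").read_text(encoding="utf-8")

# Step 1: analyze the writing against named components of style.
style_profile = ask(
    f"Here is a large sample of my writing:\n{writing_samples}\n\n"
    f"Analyze it in detail against these components of writing style: {STYLE_COMPONENTS}."
)

# Step 2: build a scorecard from that analysis.
scorecard = ask(
    f"Based on this style analysis:\n{style_profile}\n\n"
    "Build a rubric that scores a piece of text from 1 to 10 on how well it matches this style."
)

# Step 3: write with the profile plus examples, then score the draft.
draft = ask(
    f"Style guidelines:\n{style_profile}\n\nExamples of my writing:\n{writing_samples[:8000]}\n\n"
    "Using the guidelines and examples, write a 300-word blog post about comma splices in my style."
)
score = ask(f"Rubric:\n{scorecard}\n\nScore this draft against the rubric and explain:\n{draft}")
print(score)
```

The scorecard step is what makes the loop iterative: if the draft scores poorly, you feed the critique back in and regenerate, which is the "rescore and try again" Chris mentions.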

MIGNON: Awesome. This has gotten to be a really long show, but thank you so much for staying with us, for giving us this much time. It's just such an interesting discussion, and you're one of the most knowledgeable people I know about this topic. So I hope this was as interesting and useful to all the listeners as it was to me.

Christopher Penn, people can find you at Trust Insights, and, you know, I highly recommend your newsletter, the Always Timely, no, Almost Timely.

CHRISTOPHER: Almost Timely.

MIGNON: Almost Timely newsletter. Thank you so much, Chris.

CHRISTOPHER: Thank you for having me.

MIGNON: And thank you, listeners, for joining us today. The Grammarpaloozians who support the show really help make this work possible. I hope getting a taste of the benefits will help you make the decision to sign up and be part of the team. If you're listening on Apple Podcasts, you can sign up right in the app on the Grammar Girl landing page, and if not, you can learn about other ways to sign up at QuickAndDirtyTips.com/bonus. That's all. Thanks for listening.