In this episode, book coach and business advisor Jaqui Lane reveals why writers shouldn't fear AI but should instead take agency over how they use it.
As the founder of the Hii (Human Intelligence and Imagination) Project, Jaqui shares insights from both sides: as someone who actively uses AI while also championing the irreplaceable value of human creativity.
Jaqui explains her approach to setting AI boundaries and shares surprising discoveries from interviews with writers and clients. She also reveals why writers shouldn't worry about their jobs and what they can do instead.
Whether you're a content creator or an aspiring author looking for some assurance and guidance, this conversation offers fresh perspectives and ideas for an AI-obsessed era.
LINKS:
On LinkedIn - Jaqui Lane
Jaqui's website - The Book Advisor
Article - The Hii Project
Scott: Thanks for joining me today. Jaqui Lane is a book coach and business advisor who focuses primarily on self-publishing and other ways that writing can increase your impact, recognition, and visibility. I discovered Jaqui after she had written a LinkedIn article on February 1st called The Hii Project, that's H-I-I, A Human Agency Project for Writers and Publishers. And I was fascinated by this idea of a project for writers and publishers centered around making sure that we don't lose our humanity at a time when AI is so increasingly popular. Now I should point out, I don't want to scare you off here. If you're a big fan of AI, this is not an anti-AI episode by any stretch.
As you'll hear in our discussion today, Jaqui is a big fan of AI and has gone out of her way as a passionate writer and consultant to learn all about it. And obviously you can find plenty of advice about it and plenty of people telling you how to use it in different ways, some good advice, some bad advice.
But I think what Jaqui is talking about here is that there needs to be a larger conversation on the human side of writing, publishing, and content, about maintaining the humanity in it all, not only for our own creative outlet, but also for our audiences.
And it all kind of started by just seeing all of this talk out there, not only about all these different things that AI is doing in the realm of writing and publishing, but also talking to other writers who were becoming increasingly fearful about how it was going to impact their work or their lives.
Jaqui: I'm a business writer myself. And another part of what I do is that I mentor and advise people on how to publish their own books. So I'm in the writing space and the self-publishing space a lot.
And so I've been watching, really since ChatGPT came out, wondering what it is, how easy it is to use, and what potential threat it might pose for content creators. And I had a client a year ago who worked with me on a project about writing a book using ChatGPT, to learn how to write a book. So I actually learned that process.
So a lot of the people in my sphere, writers particularly, have been asking me what I think about ChatGPT. Is it going to take their jobs? I've been concerned all along about the training, about how the companies behind ChatGPT and others have been scraping the internet for their data and what that means for writers in terms of their IP, in terms of protecting it, and in terms of how you stop that and make some money out of it.
I mean, there are plenty of people out there fighting the copyright and IP part of this, with way, way bigger resources than me. But I was kind of waiting for someone to stand up and stand out and say, hang on, everybody.
We're letting this thing happen to us. We need to become more proactive. We need to get out there and take action about how we can regain, or in fact maintain, our agency. So it was a slow-burn thing, and over the Christmas holidays I had a bit of time on the beach down here in Sydney and thought, well, you know what, if no one's going to do it, I'm going to do it.
I've got the saying that hope is not an action plan, and in fact, hope is waiting for somebody else to do something. And I thought, actually, I've got to stop hoping, I've got to do something. So that's how it ended up.
Scott: Not that long ago, I had talked about where I thought we were in the AI content creation process right now. I felt like we had had several months when we were just overly excited about it, trying all different things, trying to infuse it into our company, our processes, our content creation, and then we started to notice some of the drawbacks of that, and we've cooled off a little bit.
The other thing we've heard about is just how people are taking full advantage of this in a way that's just putting more content out there that's boring and uninteresting and doesn't resonate with people, and how that's become a challenge.
So if we put all that aside, you almost have to ask yourself, are there two groups of people out there using AI? We have maybe one group that couldn't care less about all this humanistic stuff and all this human connection stuff. They just want to have the tools, create the content, be happy that they got it produced in seconds, put it out there, and call it good, and they're okay even if they only get like a 0.5 percent return on it. At least it did something.
And then there are those who want to find that balance between, yes, I want to be able to optimize my content, or I want to be able to do certain things faster, or I'm going to take what AI did and humanize it so it does have that element that can connect and resonate with people.
And if we can believe for a minute that that's how this breaks down, those are the two groups of people that are creating content today. Which group is bigger?
Jaqui: I'm not sure that there are fewer of them than the humanists. And certainly what I have seen more and more of is that there's a group of people who are, what I call, gaming the system. You know, they're getting ChatGPT to write books and selling them on Amazon.
And in fact, I think mid last year Amazon put a limit on the number of books someone could upload in a day, and I think that limit is three. So they get ChatGPT to actually write a book, shove a cover on it, stick it on Amazon, sell it for three ninety-five, and market it.
So I think there are always going to be people like that who see an opportunity to make some money. And I've seen them. I'm on a couple of business book award judging panels, and I've had to judge a couple of books that I'm pretty sure were done entirely by ChatGPT, with an author's name on them.
So I'd like to think that there are more of us, people who like humans and human-to-human connection, but I'm not sure. And it concerns me, which is another reason why I just thought, well, I've got to at least get out there and start having the conversation at a wider level.
Because people will always take advantage of whatever technology is out there. And whilst I don't like it, and I feel people are being ripped off, it's not illegal. Yet. Or ever, possibly.
I think most of us have become really more skeptical and we don't trust much.
Because we're not sure if we're seeing a deepfake, or if something's been written by a human. Not that humans always write perfectly, and you might not agree with what a human writes, but at least it's human. So I think there's a level of distrust that these AIs and virtual things have created, which I think is a real shame, but I think that's where humans can excel. You know, I'm really big on content that does something, not content just for the sake of it. The world doesn't need more stuff.
My view is that it needs good content that helps people enjoy their lives, make their lives better, make the world a better place. It doesn't need more crap content, as I say. So I think consumers are starting to get there, but I think it's still a real concern that people are misusing and abusing a lot of the AI.
Again, particularly in the content space for me, writing, but also across video, art, all the rest of it.
I'm pretty good at picking what's been written by AI. It uses what I regard as overly flowery corporate speak, and I really dislike corporate speak. I write about corporate speak a lot because I really hate it.
So I can pick it because I see these terminologies and these phrases. And when I'm working with the writers and authors that I work with, if they're using it, I get them to get rid of it. Not for the AI reason, but because corporate speak is really not designed to provide clarity. In fact, it's designed to obscure, and as a style of writing I don't really like it.
Scott: Now if you're a big fan of AI and you heard her say get rid of it, you might have had a huge gasp. But I don't think she's necessarily saying never use it because she's also pointed out ways that she's used AI. It just has to be used for the right things and for things that really make the content better, not just for you, but for your audience.
I mean, Jaqui's been talking about this pattern of people having AI write the book and just putting their name on it. Andy Crestodina was on this show, and he talked about how, yeah, you can let it do that all day long, but it's not going to connect with people in the same way.
Because the reason why people seek out certain authors is because they have a style, or a way they frame things, or something that really grabs them, or resonates with them, or inspires them, and one person may say this author is inspiring, but that one isn't.
And the reason is different humans are going to connect with different humans. What I think AI is changing is how writers write,
not only in the way that we write prompts so that we can optimize the content in a way that really works, but I think we're just going to have to evolve in our ability to humanize our own writing just to stand out even more with so much other stuff out there to compete with.
Jaqui: Having just read this book, I would like to think that we humans are special, and I think we are. And, you know, novelists or whoever, Simon Sinek or whoever, have a particular approach, and they articulate it a certain way. AI is creating models for it to learn on its own, without us.
So I think we're not there yet with that. I think there will come a time when it can write a Ken Phillip novel and you possibly wouldn't know the difference. From what I understand, that's a way off yet. But again, the thing for me is that we humans need to take agency over what we're plugging into these models.
And understand that everything we plug in is helping the models to learn, and therefore we have to take ownership of what we actually share with them and what we choose not to, and take that responsibility on. Instead of saying AI is horrible, it's going to take my job, or whatever, be really thoughtful and considered about how you interact with it, what you ask it to do, what you feed it, what you share with it and what you don't, because what you put in, it might use to push out for somebody else.
I'm quite big on this kind of ethical framework, I guess, and also understanding that it's up to us. We can't blame it, because we're the ones with the agency. And this is the discussion I've actually been having with a lot of people, particularly people who say to me, I'm really scared about my livelihood and this.
And I said, well, firstly, I don't think you should be. I can understand why you have concerns, but you have the agency here. You're the one interacting with it. You're the one asking it to do things or setting up prompts and stuff like that. I said, you need to sit back and think about what level of responsibility you're going to take with this, because you can make those decisions.
Going back to being human, the act of reading a book, the brain synapses, how you absorb it, helps you slow down. I mean, there are any number of studies that talk about that. But, as I mentioned in my article on the Hii Project, I think with physical books and the human part of it, we want to actually connect with authors if we can, which is why writers' festivals are so popular, certainly here in Australia.
And, you know, I've got a couple of client book launches next week. And there's just this thing about it: somebody's obviously spent time and effort to write their story, or whatever that is, and people want to connect with the person physically. So when they can do that, they do. And when they can't actually be in the same room with them, a printed book is almost like they are doing that.
And there is something, I think, innately very special that us humans like around it. ChatGPT and Perplexity and Claude and all these things that are out there, as I've said, are really just tools. And yes, tools can be used for good or bad, or whatever.
So we've got to develop a different mindset around what it is, what it can do, what we don't want it to do, and understand what we want to put into it and what we don't. I mean, I'm a corporate historian with another hat on, and I interview hundreds of people every year.
None of those interview transcripts, none of those audio files, go anywhere near any AI. They are never going to see the light of day like that, and that's a very conscious decision of mine. It's also a privacy issue, but there's a whole bunch of stuff that I have that I will never share on a digital platform, for a whole host of reasons.
So I'm making very clear decisions about how I interact with AI. And I use a lot of it, but I'm very clear about the purpose and what I do and don't share.
Scott: Yeah, I think Jaqui's point about people, humans, taking just a little more responsibility or more control over how they use AI is a huge point. I mean, I don't have to tell you, you could probably think of a couple of movies or TV shows that have warned us about getting a little too obsessed with AI or technology in a way that starts to hurt humanity, in small ways or big ways.
I mean, I think of shows like Humans. Have you ever watched the show Humans? Or obviously movies like Terminator. But I think, you know, while some of us might be saying, Okay, world, let's keep this in check. I mean, at the very least, as content creators, for our businesses or just for our audiences, we can say, you know what?
We're going to take a little more control over this because it means something to us. As an organization or as a content creator, I'm going to say these are where the parameters are when it comes to how much humanity and how much technology is involved in my content creation.
We don't want to learn some things the hard way. We don't want to learn things when it's too late. And granted, we may not have to worry about something, at least not now, of the magnitude of the machines or the AI rising up against us or something.
I'm in the process of writing a book right now, and I already knew some of the things you had to really keep tabs on when it came to AI, but I've also learned about some new ones along the way.
I am doing some academic research. And sometimes I may have read over those academic papers several weeks ago or a few days ago, and I don't want to read over them again, but I know what's in them. So I will put that into ChatGPT, or into an AI, and say, hey, I'm making this point about this element of content or this element of humanization. And when it comes to this study, can you find some things in this study that help me back this up? Because I know it's in there. And I've gotten answers back that say, yes, this expert says blah, blah, blah, and it writes it out.
And then I look at that and I'm like, well, wait a minute, that doesn't sound like what that expert said. That sounds like me. And I will have to go back and say, are you sure that's what this expert said? And then it'll be point blank honest with me and say, oh no, this is not exactly what he said in the study.
This is just me taking what you want and putting it in their context. And I have to go, well, don't do that. So the last thing you want to do is be giving it so much power where suddenly it's misquoting somebody or making a connection that's not there because one of the things that it's so interested in doing is making you happy.
That's why it talks so nice to you, and it may be doing things like that. So you really have to make sure that you're keeping AI in check and not look at it as this perfect thing that is so much smarter than you that it can't make mistakes because it can.
Jaqui: I was chatting with somebody the other day and they were saying that they were arguing with ChatGPT about something, you know, because it wasn't giving them what they wanted.
And I said to this gentleman, who I didn't know very well, why are you arguing with it? I said, that's not the way to use it. And he said, well, I've used my 40 prompts for the day, and it hasn't given me what I want. And I said, well, how are you prompting it? And he goes, oh, I asked it to do this.
And I said, well, that's not it; what context did you give it? And he said, what do you mean? And so I said, well, it's actually your responsibility to learn how to use the tool. And I think that raises a kind of different angle for us, not our job: people need to train themselves, or go and get trained, on how to use these tools properly, whatever tools they're using.
Now, I know they market it to say, oh, you can just get in there and play around, but that's not responsible, you know, for either you or humanity.
Scott: So as we revisit some of the challenges associated with AI and content, and maybe look into some new ones that we haven't considered before, you can kind of see why Jaqui decided to come up with this agency project called the Hii Project, that's H-I-I, as a means to maybe be a rallying cry for writers and content creators who want to find those boundaries. They want to take agency. They want to take a little more control, where we recognize the best ways to use AI while also making sure that we don't lose our humanity in the process. And I think what she's looking for is to see if this can be something that like-minded creators and writers can come together and develop, to not only get other people involved and maybe raise awareness, but also continuously show what kind of great content we can make, whether it's books or anything else, by keeping those two worlds in balance.
Jaqui: It kind of started because I've just finished reading Yuval Noah Harari's new book called Nexus, about information from the Stone Age to now. I read that over Christmas, as one does. The whole book is really an existential discussion about how information is not truth.
More information doesn't make it more truthful. So that got me thinking, and I've been really toying with taking a proactive role, particularly for people who are in the writing and the publishing space, but not exclusively. And I kind of just thought of hi, as in hello, because it's a very human and very simple thing that we all do. And I know that chatbots do it too now, but if we can't even feel that we're talking to a real person, I think humanity's got an issue. I like chatbots. They help me solve things. But, you know, I know it's a chatbot.
And I think there's that line between knowing that you're dealing with a chatbot or AI, or a human, and needing to know. So I was trying to come up with a name for this thing that I wanted to do, and I thought, well, human intelligence and imagination, you know, so that's why you've got two i's.
I think the imagination part is particularly human. AI can't imagine; it just scrapes the data, scrapes the net. It might look like it can imagine things, but it can't. So that's a very human thing.
I just kind of wanted to put it out into the world. So there's no big organization. I haven't got grand plans for it. I just thought, well, I'd like to seed it out into the world and see if either I'm just on some random lone thought process of my own, or maybe it might get some resonance, and maybe I could bring people like you and me together, and it might turn out to be something a bit bigger that helps, or starts conversations around what being human, and what this interaction between humans and AI, is all about. It was like, let's send it out to the world and see if anybody responds, and who responds.
It's possibly logical that I should create a LinkedIn group, because I'm very active on LinkedIn. And it may be that, if that goes well, I create its own little hub, quite frankly; that's not difficult.
And I'm actually talking to the author who wrote the book with me about how to write a book with AI, about what we might do. He's a real whiz at the whole deep understanding of prompting and the back end of it. Peter is his name. And Peter and I thought we might run a series of videos to take a deeper dive into how you might do that, why you would, why you wouldn't, on different platforms, and start building up a little free library of stuff like that. So do some educational type stuff. Maybe have some kind of coordinated StreamYard sessions, and just create little hubs of things.
If in fact that's what people want. So it's going to be highly unusual for me to take that, I guess, iterative approach, because I'm usually a project manager who gets things done and has a clear goal, but I really want to resist that. I want to see, is there a need? Do people want anything like this? And what do they want? And based on that resonance, or otherwise, I'm happy to take it wherever it needs to go.
Scott: You know, I've heard Jaqui talk about some of the discussions she's had over the years with writers, especially book writers and content writers, and she said in recent months there's this undying fear of, you know, there's not going to be a need for me anymore.
And some of them are so down about it that they almost don't want to write anymore, because it's like, what for? It's clear that people just want AI to do that now. And I think it's really important to maybe pause that, because this isn't the first conversation we've had on this show about this.
And I bet there's some things in our discussion that you like and you agree with, both regarding the use of AI, and the importance of humanity and creativity,
and I don't think it's a big stretch to think there are a lot of people like that. I mean, we're all trying to figure this out in different ways. And there's plenty of room for writers who want to take this approach. Where we say, yes, we want to have AI, but we want to figure out the best ways to use it.
And the best ways to maintain humanity, not only for our own work, but also for our audiences. As we point out on this show, I personally do not think there is ever going to be a time where your audience says, you know what, I don't care anymore, just give me something to read. And I think that's one of the main reasons why this project is a good idea, and why Jaqui tells those writers: don't despair.
Jaqui: Don't despair. You know, we humans are still here and active. Take agency over it. I think that shift in mindset is really important. I have discussions with people a lot where it's, AI's doing this to me, or it's going to take over my job.
No, it won't. You know, I know a number of writers who've got good, well-paying jobs, who are doing their writing on the side of their paying jobs. I mean, I think there's a huge demand, even more so, as we've talked about, for humans to connect with humans. So be positive about it.
And I guess my second thing is: learn what the AIs are, what their limitations are, and what they can be used for. I use ChatGPT and various other things depending on what I want from them, and the only way to do that is to get in there and do some training and learn and experiment, rather than being fearful of it.
You've got to get in there, and there are lots of free versions of these things, and just start playing around, so you get to understand a little bit of it. So those would be my two core things. And don't argue with the AIs, because that's not helpful for you, and they don't really know how to respond.
I'm innately human. I intend to stay that way. And I love the art of creating through writing. So it's my absolute passion. And I just want to let everybody know who's like that, that you should still keep doing it.
Book Coach, Business Adviser, and Self-Publishing Expert
Jaqui has interviewed over 900 Australian businesspeople, published over 400 business books, and written 25 company histories and over 200 articles. She is one of Australia's leading business historians and self-publishers, having researched and written books for the Commonwealth Bank, Westpac, Amcor, Cleanaway, Woolworths, Patricks, Peter Lehmann Wines, Wilson Transformers, Myer and the Asia Centre Foundation.