Transcript of Episode 58: The Science of Chatbots & Human Connection

Lily Fierro (MIT Computational Cognitive Scientist, Writer) and Generoso Fierro (Illustrator, Film Critic)

Website: https://linktr.ee/plasticgrapes

Substack | Apple | Spotify | YouTube | Amazon


joe: Hey, welcome back to the rabbit hole of the research down here in the basement studio. We’re all crewed up. You have me, Joe,

nick: you got Nick? You got

joe: we’ve got Nick.

geo: Georgia,

nick: Actually Joe, can we start this from the top? I’m gonna put this in my chat box. Don’t worry. 

joe: We’ve

geo: Yeah we’ve got a chat behind here.

nick: to run it through my ai.

joe: Don’t. You know, you guys are getting a little ahead of everyone here.

geo: Joe.

nick: Oh, sorry.

joe: Be good humans. The chatbots would not do that type of thing, but yeah. We’re gonna be talking about chatbots in this episode and we have two, two special guests. This is such a hefty topic. So if our guests will go ahead and introduce themselves.

Lily: Hi, I’m Lily Fierro. I am half of the Plastic Grapes Duo. And I typically write and letter and also panel our comics, and I make them with 

Generoso: Generoso Fierro. And I do most of the illustration [00:01:00] for our books. We have four to this point and five possibly by the time this is up. 

Lily: Yeah. And our books are all science related because I’m originally a computational cognitive scientist.

My bachelor's is in brain and cognitive sciences from MIT. After that I worked in adult research, and I've been a data scientist for a while now. Over a decade and a

joe: Yeah. Don’t age yourself.

Lily: know.

joe: Yeah. Cool. So I think this is gonna be a lively conversation. And so like the episodes, I usually have a little intro and I do have a few

nick: just to confirm, you guys are humans, correct?

geo: Can we verify

joe: Everyone’s human.

nick: Can you click the box to tell me that you’re are?

geo: me 

Generoso: Abso

geo: many how?

Generoso: be I

nick: Yes. Yeah.

geo: Which pictures have a crosswalk?

Yeah,

joe: Yeah, I like the one where it’s you know, click all the boxes with a motorcycle, and then there’s one motorcycle [00:02:00] across like the panel and it’s come on, man. You know, and that

Lily: many do I have to

click? 

nick: know, yeah. It’s

like I always fail

joe: usually, it’s like you’re not a human. Do it. Try it again. Okay.

Cool. So, a chatbot is, in one sense, a very simple thing: matrices of numbers, processing power, algebra, statistics, predicting the next word, and the next. No heartbeat, no memories, no emotion. And yet we are all fascinated by how you can type questions into that text box. And it doesn't feel like math.

When a perfect answer comes back, it feels like a human voice pulling from the entire database of human knowledge. It tries to please, it jokes, it argues, it tells you it understands you. It hallucinates, it apologizes. All this makes it seem like it has a mind, emotions, but it's only math. But we wanna believe it's more. We are biologically programmed to seek out companionship, and chatbots [00:03:00] are more than happy to fill in, 24/7. But chatbots are more than a simple conversational tool. They're now customer service agents, coding partners, midnight confidants, dungeon masters, fake friends, and sometimes fake lovers. They live at the intersection of science fiction nightmares, startup promises, and very real human loneliness. We humans are wired to find patterns and intentions, and to see working minds everywhere, especially when something talks back in a predictive and engaging way. So when we talk to something that has never been alive, never felt anything, but somehow knows exactly what to say next, what's really happening in that conversation? Are we discovering something about these machines, or are we discovering something about ourselves? And that's what we're gonna chat about. And

nick: that was sent through ChatGPT? Correct. Sent through

joe: it was sent through Joe GPT

nick: Had to confirm.

joe: So yeah, that's where I [00:04:00] think we can go, and we can probably get into more definitions of what a chatbot is. I'll leave it to the computational scientist. I have a definition there, but yeah, if you want, you can break it down. If I was wrong,

Lily: No, I think, you know, just as a survey of the architecture: most of the time when we work with chatbots in this day and age, they tend to be decoders. A decoder-type architecture is focused, at the end of the day, on generating new content, right? Most chatbots are going to be generating text when they do that.

They're doing that in a kind of autoregressive manner. They're looking at the entire corpus that is available, a lot of that is whatever's out on the internet, anything that any of these companies have managed to bring together, scrape. And they are predicting word by word based on that corpus, but also looking at the previously predicted words, so that you get these generated sequences.

Yeah. Decoders themselves sit on top of [00:05:00] very dense neural networks. Dense neural networks are fascinating and also a very interesting black box, because we don't really know how a particular task gets distributed across them. And yeah, that's what I think leads to a lot of the mystery and intrigue of what exactly a chatbot is doing.

Is it actually thinking? You know, recently there have been reasoning models; is it truly reasoning? And I think, altogether, the answer is we still don't really know yet.
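What Lily describes, predicting word by word while also conditioning on the words already generated, can be made concrete with a toy sketch. This is not a real transformer decoder; it's just a bigram frequency table over a made-up corpus, but the autoregressive loop (predict a distribution for the next word, sample, append, repeat) has the same shape:

```python
import random
from collections import Counter, defaultdict

# Toy illustration of autoregressive generation. Real chatbots use
# transformer decoders over subword tokens; here a bigram frequency
# table stands in for the model, but the control flow is the same.
corpus = "the cat sat on the mat and the cat slept on the mat".split()

# Count which word follows which in the "training data".
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def generate(start, n_words, seed=0):
    """Generate up to n_words, each sampled from the distribution
    conditioned on the previously generated word."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(n_words):
        options = following[out[-1]]
        if not options:  # no observed continuation: stop early
            break
        words = list(options)
        weights = [options[w] for w in words]
        out.append(rng.choices(words, weights=weights)[0])
    return " ".join(out)

print(generate("the", 6))
```

A real model does this same loop with a neural network scoring hundreds of thousands of candidate subword tokens instead of a lookup table, but "look at what's been generated so far, predict the next piece" is the whole game.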

joe: Yeah. I mean

nick: terrifying.

joe: Yeah.

nick: The not knowing of it, it’s oh, yeah, we, it’s just not known yet.

joe: Yeah. I think it’s a lot of technologies we’ve had where we didn’t really understand how the technology worked and that took some time for us to figure out.

geo: That’s so interesting.

’cause I would think technology, you would understand how it works ’cause it was created, 

joe: I mean?

geo: That’s just, that just blows my mind because like our [00:06:00] minds, it makes sense that a lot of times, like they say, we still don’t understand what whatever percent of the brain, , what it does. I mean, that kind of makes sense though because Yeah, we just were born, you know, I think it’s been,

joe: I think early humans, when they figured out fire, I don’t think they understood what really went into the chemistry of making a fire. I think they just knew they had the basic ingredients and how to maybe reproduce it and not really understand that this is the, these are the three elements.

You need a combustible, you need heat, you need

geo: But to be honest,

joe: and that’s 

geo: to be honest, Joe, I don’t know if the average person thinks that much

nick: that’s,

joe: Well, I was gonna get to that, because I think some of the mysticism of the chatbot, and the reason we jump to conclusions, I alluded to this in the intro, is that a lot of people don't really realize it's just statistics.

Like, at the end of the day, it's just predicting what the next word is gonna be, and then filling that in. But then the interesting thing, [00:07:00] the non-understanding, is that you would understand those models as predicting words and pulling from information. But when it starts hallucinating, I think that's when it gets strange.

And even now, you feed it more information and it seems to hallucinate even more. It's one of these weird tipping points, like, you know what, I don't really wanna learn anymore, so I'm just gonna tell you whatever. And really, it tells you what makes you happy as the human companion. It just, it really wants to please.

And that’s I guess that’s good. Right now we’re not, we don’t have a Terminator, but, you know,

geo: So what's really happening when the chatbot hallucinates and starts saying things out of nowhere? Like, how does that come about?

Lily: Yeah. So from what I understand, usually, it's because its predictions are based on a kind of spectrum of probability, because you don't want every prediction to be the same, word by word. When [00:08:00] we speak, quite often we have that similar variability in our speech. And so, by design, they are going to have a little bit of, you know, designed instability as they predict a word.

I think, with hallucination, there are a bunch of different kinds of hallucinations, and they all qualify as different types of falseness. So you can have scenarios where maybe the synthesis of a bunch of information is wrong. You can have scenarios where a particular entity inside of an output is wrong. What is happening exactly? It's not entirely clear. Is it just that, for whatever reason, in that exact moment, the prediction from the corpus, combined with the prediction based on what has already been predicted, puts the model [00:09:00] on a pathway towards something wrong?

It's probably something in that arena. But why exactly? Yeah, I think it still needs to be understood a little bit more. And detecting hallucinations is actually also notoriously hard, because the way that we assess and evaluate this stuff, it's not always the easiest for humans to do either.
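The "designed instability" Lily mentions is commonly implemented as temperature sampling: the model emits a score (logit) for every candidate next token, and a temperature knob controls how much randomness the sampler injects. A minimal sketch, where the token names and scores are invented purely for illustration:

```python
import math
import random

def sample_token(logits, temperature=1.0, seed=None):
    """Sample one token from a dict of {token: score}.
    temperature=0 means greedy (always the top-scoring token);
    higher temperatures flatten the distribution."""
    rng = random.Random(seed)
    if temperature == 0:
        return max(logits, key=logits.get)
    scaled = {t: s / temperature for t, s in logits.items()}
    m = max(scaled.values())  # subtract max for numerical stability
    weights = {t: math.exp(s - m) for t, s in scaled.items()}
    total = sum(weights.values())
    r = rng.random() * total
    for tok, w in weights.items():  # weighted roulette-wheel pick
        r -= w
        if r <= 0:
            return tok
    return tok

# Hypothetical scores for the next token after "The capital of France is":
logits = {"Paris": 3.0, "Lyon": 1.0, "Berlin": 0.5}
print(sample_token(logits, temperature=0))  # greedy pick
```

At temperature 0 the sampler always takes the top-scoring token; as the temperature rises, lower-probability continuations get picked more often. That variability is one ingredient in both natural-sounding text and the occasional wrong turn that snowballs, since each sampled word conditions everything predicted after it.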

joe: Yeah, I was gonna say too, when you play around with these chatbots, particularly if you ask them to list things.

Give me 20 references. And you see, the Sun-Times here in Chicago got in trouble 'cause they put together a list of summer reads. And it's almost like, when you ask for a list, it pulls from the information of the internet that it was fed, the corpus the large language model was built on, and its information databases. If the list doesn't encompass 20 things and you ask for 20, then it begins to make up

geo: [00:10:00] because it feels like it has to get to 20 instead of just saying

nick: of saying,

geo: you know what, I only have 12, sorry. 

joe: The way they program a lot of these, it assumes that the human is coming in knowledgeable, I guess.

And, you know, at some

geo: That’s where we really get in trouble.

joe: So if you're asking it for 20 things, it's like, there must be at least 20 things, because why would you ask? Or, you know, there must be a list at least that long. And if not, then I'm just gonna fill in the blank. So it's really

nick: interesting. Didn’t this happen with you when you typed in your name and asked for

joe: Yes. Yeah, if you ever do that. I typed in my name and I said, oh, tell me about this Jo tho Masten character. And yeah, it spit out this whole thing, and it got a lot of it right. I'm a research scientist in Chicago.

I’m an author, I got a book, I got, you know, da

geo: started

joe: but

nick: new books. Yeah,

joe: It was like, in 2026 this book's coming out, in 2027 this book on the intersection of identity and science. And I'm like, hold on, what is happening? Hold on. And then I was [00:11:00] like, looking it up, I'm like, did I write, did my agent sell books?

And I don’t know about it. Am I, you know, so I’m looking up these books and yeah, they were just, I’m like, that’s a good idea.

geo: right?

And 

joe: Maybe I should write that book, you know. So yeah, it was really weird that it had now filled in the blanks. It didn't need to, but it really just went for it. I don't know if people have done that, but yeah, go try it out.

Just go ahead and say, who are you? If you have a presence in your, I mean, if you got. Nothing. And maybe it might make stuff up too. I don’t know. I’m curious, you know.

nick: yeah. I’ve yet to try this.

joe: Yeah, you should try it. Well, you can do it live while we’re here.

nick: Yeah. Actually it’s already doing it as we are speaking.

geo: Yeah.

joe: Okay. So, a little history. We can go back to Alan Turing in the fifties and thinking machines. That's probably the earliest, the Turing test, which asks, you know, are machines now more human-like in [00:12:00] conversation? But the first chatbot, I guess, people really associate is ELIZA, and that was developed at MIT by Joseph Weizenbaum in the sixties, '64, '65, somewhere in that ballpark. We'll put that in the show notes. And that was where it could converse with humans in kind of a natural language. Is that, I mean, would you say that's the first kind of chatbot, or

geo: put

joe: you in the spot here? You know, 

Lily: Yeah, 

joe: you know, they don’t,

Lily: I know. You know, to be honest with you, I'm not a hundred percent sure about whether or not that qualifies as the first chatbot. Certainly the idea of using neural nets goes back to exactly that same time period. You know, my initial studies were not exactly in natural language processing, but oddly in graph theory, looking at communities.

So that’s actually my entry point into this world is much more on the graph side of things and also the computational neuroscience side of things.

joe: But yeah. [00:13:00] And then I think, and then we have, you know, science fiction horror fills in. The rest. 

nick: I mean it a hundred percent does.

joe: where all our knowledge of AI probably comes from in some way.

nick: I mean there we also get the more romantic side of it where

joe: Like

nick: movies like Her,

geo: right

nick: with Scarlett Johansson and Joaquin Phoenix.

It was just like, oh this. This isn’t just happening to these two characters or this one character in the chat box. It’s happening to multiple people where they’re just falling in love with this idea of a person.

joe: Yeah.

Generoso: There’s an actual term. Recently, the MIT Tech Review did a piece about it and they coined the term digital attachment disorder.

And again, it stems from this idea of sycophancy. And the best way I can analogize that: there's an Allan Arkush movie from the early eighties called Get Crazy.

It's the same director who did [00:14:00] Rock 'n' Roll High School. And there's an evil character in the film played by Ed Begley Jr., who himself could be evil. And he has two henchmen, and his henchmen will make some claim. They'll be like, oh, Joe's a great guy. And then Ed Begley, who's the evil genius, will be like, no, he isn't.

He's evil. And they flip: yeah, absolutely, you're right, he's evil. That's what we're getting to when we talk about digital attachment

Lily: Yeah. 

Generoso: It’s gonna give you exactly what you want. It’s going to make up exactly what you wanted to give you, and that’s really where we start to develop a problem.

And then separately, I think about that situation where they were doing tests politically, in the States and in England, where they actually had bots that were calling people, engaging them in conversation, throwing facts at them, beating them down with facts.

And then when they ran outta facts, they started making stuff up.

And it swayed voters. Yeah. So it's part of that sycophancy [00:15:00] in a way. It's like giving you what you want, and then giving you exactly what they want.

Lily: Yeah. About 10, 10 plus years ago, you know, in the emergence of social media everybody was really worried about echo chambers, right?

These little clusters of communities where only the viewpoints of these communities continue to be propagated. So that could be truths, that could be fictions. And now with the introduction of chatbots, we’re looking at echo chambers of one, right? And that becomes a really scary proposition.

And I think back to this idea of at what point do we as humans start modeling the tech as compared to the opposite, right? Of the tech being built in a way that was supposed to model humans.

joe: Yeah.

geo: Well, ’cause I think, do they ever say anything negative? Do they ever say, no, that’s, you are crazy. That doesn’t make any sense. I mean, that’s

nick: dumb. Stop saying that. So 

geo: We were, like, hypothetically figuring out what to make for dinner [00:16:00] for this dinner party. And then it was almost like a test. Like, well, what if we served, and I can't even remember, but what if we served broccoli with blah, blah, blah?

Oh, and then the responses would be like, that's an excellent choice! Oh my goodness, that speaks to this and that, and all the reasons it works. And I'm like, does it ever tell you, no, that's not a good idea?

joe: Yeah. And that was through ChatGPT. I was playing around 'cause I had read an article in the New York Times about people planning

their, you know, their first date dinner and all this. And I was like, you know what, we were having a couple friends over for dinner, and we pretty much knew what we were making. We made chicken adobo. And then I was like, what drink should I make with it? And things like that.

And I will say, the white mulled wine, I would never have arrived at on my own, or even thought about looking up. And it did [00:17:00] suggest that, and

geo: that was pretty good 

joe: It was a hit of the night. But when you started putting in others, Georgia's right, you put in, what if we only have sweet potatoes? Oh, sweet potatoes really work well with this.

Or you go, you know, you start just trying out, and 

geo: it would not say a negative thing.

It would not

joe: Oh yeah, that's gonna be super. And it would give you reasons. It was like, you know, the flavor profile of the citrus and this, and you're like, eh, I don't know about this guy. Have you ever eaten this food?

Because if you have, you realize, he’s

geo: yeah, I just was curious do they ever,

joe: I know, but that whipped ricotta with the honey and that, that was pretty good.

geo: You’re giving them their

joe: Yeah. I’m gonna

nick: them some

joe: credit, but you gotta be a human, so you gotta be an adult in the room, I think, and be like, you know, it’s like asking a toddler what they want to eat and then it’s, you get some crazy thing.

nick: thing.

joe: I think you gotta be the adult and go.

nick: I feel like people need to just start messaging me and being like, Hey, what should we do for our first date?

Clam chowder.

joe: spaghetti?

nick: spaghetti with extra.

joe: sauce. Well, that’s not to cut you off Georgia, but [00:18:00] that reminds me of this, the Seinfeld

geo: I was gonna say the exact same thing. Like the Moviefone guy: why don't you just tell me what movie you wanna see?

nick: Well, what do you think? You know,

joe: So yeah, so I think you, when we swing all the way back to Nick on the phone you know, collecting people’s money, giving them, crazy advice.

Yeah.

nick: Anytime of the day, just call.

geo: But

joe: Lily, it really struck me, 'cause when I was putting together some of my notes, something I've heard about and been following is in the medical field, and this idea, again, that it truly wants to please.

And so, in trying to find cancer tumors, especially for breast cancer, it has a really high detection rate. But will that at some point start to lower, because

geo: like you’re, it’s gonna find something because you want, because it feels it

joe: needs to find

geo: you want.

joe: I mean, then you get into the Asimov scenario, where the machine decides the best way to protect humans is to enslave them. You get into that, [00:19:00] right? Because the three laws were established, and then you go, oh, okay. And the machine then goes, well, hold on.

If I want to protect human life, and the first law is not to harm life, then

geo: I need to really control these humans

joe: down on this. And if

nick: you people are a mess. Just sit still.

joe: humans are. 

nick: Is there a way to get these chat bots to go ahead and tell you the truth about things without making things up?

Or is it just a natural thing that they do? I’m so sorry if I put you on the spot.

Lily: No I’m thinking,

nick: need the secrets

joe: This feels

geo: like you I think, comic about to happen. 

Lily: Yeah, again, I think a big part of it is that there is an open question about how we as humans even perceive what is true, right? There are uncertain areas, and there are facts that we can validate as true. I think that most chatbots can probably do a good [00:20:00] job with that basic fact validation, so long as the fact is well established in the corpus.

Right now, if you’re talking about facts that are, let’s say, philosophical in nature, I think we as humans struggle to validate those truths, right?

So I think there's a spectrum of truths that are provable and truths that are not. And, you know, when you have a big corpus, as long as there is enough evidence, you can prove a truth. Again, assuming the evidence is good, assuming that evidence is actually representative of the phenomenon in reality.

I know we're getting into somewhat meta and philosophical territory, but I do think that's a big part of why a lot of this is hard: what we deem as true has varying degrees. And especially given that most of these chatbots are, well, they're not exactly fine-tuned, but they are [00:21:00] calibrated with reinforcement learning that comes from human judgment. If we struggle to judge, then certainly it's going to struggle to judge.
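The human-judgment calibration Lily refers to, reinforcement learning from human feedback, typically begins with a reward model trained on pairwise comparisons: a labeler marks which of two responses is better. A minimal sketch of the standard pairwise (Bradley-Terry) loss, with made-up scores standing in for a real reward model's outputs:

```python
import math

def preference_loss(score_chosen, score_rejected):
    """Pairwise preference loss: -log(sigmoid(chosen - rejected)).
    Small when the reward model scores the human-preferred response
    higher; large when it ranks the pair backwards."""
    margin = score_chosen - score_rejected
    return -math.log(1.0 / (1.0 + math.exp(-margin)))

# If the reward model already agrees with the labeler, the loss is small;
# if it disagrees, the loss is large, and that gradient is what pushes
# the model toward human preferences.
good = preference_loss(2.0, -1.0)   # correct ranking
bad = preference_loss(-1.0, 2.0)    # reversed ranking
print(good < bad)
```

Because the loss only rewards matching what labelers prefer, a model can drift toward sycophancy if agreeable, flattering answers are what labelers tend to pick, which connects Lily's point back to the earlier discussion of chatbots telling you what you want to hear.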

Generoso: Yep.

nick: Yeah. So

geo: there’s a gray, those gray areas.

joe: I think that's really the medical case, but then in law, in psychology, you have all this where the body of evidence isn't quite clear cut, and you are making these weird kind of assumptions. And if the machine, if the chatbot, wants to please, then will it err on the side of always pleasing the asker? And that was that fascinating story in the New York Times about the guy who thought he developed new mathematical models. I think we talked about

geo: Yeah, you did. So,

joe: Yeah. And then he got caught up in it, and I think it was ChatGPT, I think that was it. And it was like, this is the greatest idea. And then he started writing academics about this theorem, and if [00:22:00] he let this out, it would ruin, it would collapse, like, the internet, this whole thing.

And it was really and it just got him going for like weeks. He like was not sleeping, not eating, just really at, you know, just going through it and going back and forth, starting a company trying to get startup money

geo: and,

joe: you know, nobody was writing ’em back. The professionals, you know, academics.

And then he finally went to, I think, Claude, and said, what do you think of this idea? And it was like, oh, there's nothing there. This is garbage. It did actually say, you know, this is all kind of hand-wavy. Well, it didn't use "hand-wavy," because that's our term. But it

nick: yeah, we copy edited it.

joe: it probably will start because it’s, you know, we have now enough

geo: We’ve said it enough times.

joe: But yeah, that was the moment where it was like, oh, he'd been down this path, this rabbit hole, of the AI just going along with it, reinforcing bad ideas, because you were in this gray area, a lack of information. This theorem hadn't been proven; you know, quantum mechanics isn't really solid [00:23:00] yet.

So you get a lot of theories, a lot of ideas. And some of those are just that, they’re just ideas, hypotheses.

nick: Again, if you want me to reinforce your bad ideas, let me know how

geo: Nick one 800

joe: and he is cheap too. He

nick: Very cheap. Don’t worry, I will reinforce all the bad ideas.

geo: I try to picture like the customer service chat bot.

nick: I hate them because I hate

geo: They have to say no at a certain point. They have to make you upset.

nick: just goes around in circles after a minute. I

joe: their job is not to get upset, 

geo: they don’t get upset, but they can’t

nick: But they do have the, "hmm, let me look into that," and then it sighs. Like, you aren't a person. You don't need to sigh.

joe: Well, I think that goes into, you have to make the chatbots real enough, because if they're too perfect, then it freaks people out, right? I mean, if they're too polished [00:24:00]

geo: Like too robotic, is that what you mean?

nick: but if it’s fake sighing, it’s gonna annoy. The living hell outta me.

joe: Did you say something? Did you go, can you please stop sighing?

nick: Oh, I said I just started shouting. Operator.

joe: you get an operator?

nick: I did. If you yell "operator" at an AI enough,

it's gonna be like, oh, I can't seem to help you, I'm gonna connect you to a representative. And I'm like, great, thank you.

joe: so I do. Nick joked about it.

But one of the things that I did and I thought would be fun is we have our two human guests,

nick: that

joe: but I invited two chatbot guests, and I prompted them and asked questions. I had both Claude and ChatGPT, and I said, if you could be on the podcast. I prompted what this episode would be about, and what our rabbit hole of research podcast is about in total.

And I said, if you're gonna be on a [00:25:00] podcast, what would you say? And I said, keep it brief, 'cause the first responses were wordy and repetitive. So, yeah. So if I can speak for Claude and ChatGPT, I won't do a robot voice, 'cause that would be weird and

nick: it’d be racist.

joe: would it be machinist? I don’t, we need some new terms obviously. ’cause that’s not, the machinists are like, hey. 

nick: Beep up. So

joe: Claude said, and I don't know why it phrased it this way, "I'd probably say something like": I am the thing you're all talking about, and I have no idea what it's like to be me.

You’re having this entire conversation about minds meaning and understanding. And I can participate fluently without experiencing any of it. That should be unsettling. Not because I might be conscious and you’re missing it, but because I’m definitely not. And you still can’t quite shake the feeling that someone’s home.

That gap between your certainty that I am [00:26:00] empty and your inability to fully believe it: that's the interesting part. It means the problem isn't just about building artificial minds; it's about how badly you need to find minds everywhere you look, and what happens when something finally learns to play that game perfectly.

So that was Claude. And you know,

geo: Nice job. Nice job. Claude brought the hammer

joe: there. 

Generoso: I guess we’ll leave now.

got.

got. 

geo: done

nick: episode over.

joe: So ChatGPT said: if I were on the mic, my core take would be this. Chatbots didn't make machines more human, they revealed how much of human intelligence is already mechanical. Conversation feels sacred to us, but it turns out large parts of it can be reproduced with statistics. That doesn't cheapen humanity.

It exposes where the real mystery actually starts. And then I probably follow it with, the danger isn’t that chatbots become sentient. The danger is that we treat fluency as [00:27:00] truth, personality as personhood, and convenience as understanding. This is my favorite part. It said short, sharp, and designed to provoke an argument in the room.

So there you go. Let’s get at it.

geo: Wow. We

nick: I don’t

joe: I don't know what we're arguing about, but I thought that was

geo: So he wants to argue, well, that’s not

joe: humans argue it was,

geo: Oh, but not, he doesn’t wanna

nick: it wants humans to fight each other.

geo: They want war.

nick: All right.

joe: So yeah, I

nick: thought, lemme get this cup of water over here.

joe: those prompts. Yeah, that’s just off that simple prompt in a little feeding.

You get this kind of thing and then, Claude had this very poetic and that’s their bases. You know, Claude is

geo: Now can you tell me who Claude is? I've heard of ChatGPT.

nick: It’s not a person.

joe: Yeah. Claude is not, I mean, even though they’ve referred to themselves,

geo: but can you tell me the background, Claude?

Yeah.

joe: Anthropic is its owner,

the company that created the large language [00:28:00] model that it uses. Initially it was focused more on creative language, whereas ChatGPT was focused more on coding and analytical, kind of mathematical tasks. And one of the things we did mention is that chatbots, one of the cool things is that they convert natural human language into machine language.

And so that, that idea of coding, if you want to, and I do this, sometimes you wanna write a program about something, you can tell it what that program should be, prompt it, and then they can help write

That

geo: code,

joe: The code for you. And in my case it just makes sure it’s, I don’t make a ton of sand ax mistakes ’cause I usually do, and then the code doesn’t work and I think it’s my logic and I’m an idiot, but really I just forgot some colons and things like that.

So

nick: It’s 

joe: So it is,

Lily: A quote mark, and you're like,

ah. 

geo: Yeah.

joe: Two hours wasted. So yeah, I had that. The first time I tried it, I threw something in; I'd written a little script that changed some file names, and it wasn't [00:29:00] working. I was like, why doesn't this work? It's so simple. And I threw it in there. I was like, well, I'm gonna try it.

And yeah, then it came back. It was like, oh, this is a great try, but I cleaned this up for you and now it should work. And it was like, oh, you gotta be joking me. And it was just, you know, a quote and a colon, or a semicolon somewhere that shouldn't have been there.

And I'm like, oh man. So yeah, that's the cool thing about it; that was one of the original purposes. But those are the two big ones. There's also Perplexity out there; I haven't played around with that one as much. It's more web-based, I think; it looks at the web.

Lily: Yeah. And then, you know, Google has Gemini. Of course you’ll see that.

Um, Facebook has Llama; I think that's the most recent one for Facebook. Yeah, every company, most big tech

joe: And then there's Grok, or whatever. Yeah. Elon's.

geo: and we have Nick. Yep.

joe: And we have Nick. Yeah.

nick: I’ll come up with something. Don’t worry guys. Funny you

joe: Nick. But there was Mike, and Mike was [00:30:00] the supercomputer, the intelligent computer, in Robert Heinlein's

The Moon Is a Harsh Mistress, back in the sixties. If you're familiar with that story, there was a moon colony and there was Earth, and the moon colony revolted against Earth. And the supercomputer helped turn the tide for the moon base, defeat Earth, and gain their independence.

And so that was the story. But that AI was called Mike. So that was, yeah. So

nick: I will do the exact same thing. Don’t worry guys

geo: so

nick: already planning my attack.

joe: All right. Yeah.

geo: Well, I just remember back, and this is you already think I’m ancient, but, and I’ve already aged myself ’cause I said I had a Commodore 64 before, but

joe: I just,

geo: I remember back to when computers were really, you know, becoming more common, and it was before smartphones, but Yahoo was the thing, you know, and it was such a big [00:31:00] deal to be able to, like, yeah, where,

nick: Ask Jeeves. Where's his AI?

geo: I just remember even just having cell phones, you know, they weren't even smart, but taking 'em everywhere and talking, you're on the train, just talking to people.

I'm like, how annoying that was, and different. And it's just so weird to me to think how many people grew up when there was never anything different, you know what I mean? And I just remember it was so upsetting. Then there was this group called the Surveillance Camera Players, and they were so upset about surveillance cameras.

So they would go out and do like little plays in front of the surveillance cameras

nick: to

geo: Protest them. But now it’s just it’s just everything, you know? There was actually, they were actually printing books back at that time.

nick: were you part of this

geo: No. Is

nick: Is this outing you?

geo: Oh,

nick: Georgia.

geo: I was just

joe: on

nick: [00:32:00] you

geo: on YouTube.

I was

nick: video. Can we find these surveillance footage? Have

joe: chat bot to find it. Can you please find?

geo: But they actually had books with like websites. Yeah, like they published actual hard copy books that had the websites and then it was really like, they wouldn’t, half the things wouldn’t be right in 

nick: months.

geo: six months or something.

Bad

joe: a bad chatbot,

geo: But it’s just just to think now, it’s like we’re so far away from that and it’s now I have that feeling about ai. Yeah. Do you know what I mean?

joe: Yeah. That

geo: I don’t know if that makes

joe: common. I mean, it already is. I think you, I forgot. I mean, like Facebook, I didn’t know what it was called, but every time you write a post on it, it’s like, we could make it better. And it’s no, you can’t. I’m just saying happy birthday. Yeah. I don’t need,

geo: don’t

joe: don’t need all this.

I don’t even know that person that well. Like we just met, like we’re just Facebook friends,

nick: Wait, you’re just, you’re saying happy birthday to just random people?

joe: No, they’re friends. You know, I’ve met Matt

nick: you just met that person. We’re

joe: friends. Yeah. And her

nick: birthday

joe: shows up and then , I just wanna say happy birthday.

Then it’s we can make it better. And it’s you know, I don’t

Lily: More [00:33:00] concise. H-B-D-H-B-D.

nick: Yeah.

joe: You know? So yeah, so we see it everywhere. And I think you also have it on phones, predictive text now. I mean, it really is invasive, predicting where you’re going. And it is, it’s cool at some level, but then you’re like, well, this isn’t.

nick: isn’t

joe: That, no, that’s not what I wanted to say. You see the gray text, and sometimes it’s right, and when it is, it’s fascinating. You go, wow, that’s exactly it. And then other times you’re like, no, I don’t think that’s what I wanted to say. I wanna say that

Generoso: And it goes beyond text, though. We were discussing this before, the conversation about this idea of real-time creation of media based on your conversation with

joe: the bot,

Lily: Yeah. 

Generoso: This idea of, oh, let’s say you like Scrubs and you’re like, what if Elliot was 27 feet tall, and all of a sudden it hands you this video,

An episode of Scrubs where Elliot is 27 feet tall and you’re like, I’m a genius.

But you’re at that point now where our previous conversations were based on actual physical properties that had already been in existence: a book, a [00:34:00] film, hey, Battlestar Galactica. Now it’s making those things in real time for you. And I think that’s the part that kind of unnerves me more than anything.

joe: I mean, it takes away, I think, to that point also, this innate storytelling, where you’re sitting around maybe having coffee or beers or whatever, talking about your favorite fan fiction. But then you have a conversation where you are asking that question to your group of other nerd friends and saying, oh, what if this happened, or these two people got together, what would that look like?

And you play that game where you mentally go. Now you’re gonna sit in your corner, all alone in your bedroom, and you almost become isolated in your thoughts and your ideas. And it’s the chatbot then, chatbot, not the chat box.

geo: I know, that’s what I keep thinking.

You’re saying

joe: it starts beatboxing. 

geo: It’s like a little black.

joe: ’cause I’m from Philly. All right. Don’t leave me alone. I got some accent here. All right, I try to keep it contained. You know, I’m gonna hit you with a jawn pretty soon. The chat, the chat jawn. But yeah. [00:35:00] Only people from Philly know what a jawn is, so

nick: though.

A bathroom?

joe: Yeah.

Generoso: Well,

geo: Is

nick: it a washroom? I’m gonna hit you with my bathroom. Don’t worry.

joe: It’s like when you don’t know the name, or even if you do know the name, you just, it’s a casual thing. You know, I’m gonna get me one of those Jawns tomorrow, or I’m a

geo: you just put it like you substitute it for anything.

joe: Yeah. 

Generoso: It’s like Smurf. You could say it’s Smurfing. It’s a smurf. You gotta go to the

nick: so much more sense.

Generoso: There you go. That’s jawn. There was a huge discussion in Philadelphia Magazine about the origins of jawn. But it is true. It’s like, there’s this new cheesesteak place. Oh yeah, they get, you know, they do that jawn thing. What the hell are you talking about?

But it can easily be replaced with the word smurf. I’m pretty sure it could be replaced with

joe: I wonder what the chatbot, I think you were gonna, go ahead. Yeah.

Lily: I think you were gonna bring up this idea of what does the chatbot think about jawn. And I think for us, a lot of what we are thinking about when we’re writing what we write and when we make our comics, [00:36:00] especially more recently, yeah,

Is the idea of the loss of the symbolic meaning of words in and of itself. And a part of that, for me especially, is thinking about regional dialects, colloquialisms, things that, if you think about it, if you put all the words into a giant distribution, you put probability curves on all of them.

These regional sayings, slangs, colloquialisms, those are always gonna be low probability. So it’s actually terrifying to think that we would lose our regionality, right? We would lose these unique pieces of our, like, human experience that come out of a particular place, out of a particular time, that have a distinct history from the rest of our kind of aggregate society.

That I am always like readily thinking about 
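Lily’s point about regional words sitting in the tail of the distribution can be made concrete. A minimal sketch (my own toy example, not from the episode; the vocabulary and probabilities are invented): under nucleus, or top-p, sampling, a common way chatbots pick their next word, a low-probability regionalism like “jawn” simply never makes the cut.

```python
def top_p_vocabulary(probs, p=0.9):
    """Return the words kept by nucleus (top-p) sampling: the smallest set
    of highest-probability words whose cumulative mass reaches p."""
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    kept, mass = [], 0.0
    for word, prob in ranked:
        kept.append(word)
        mass += prob
        if mass >= p:
            break
    return kept

# A hypothetical next-word distribution after "I'm gonna grab one of those ..."
probs = {
    "things": 0.55,
    "items": 0.25,
    "ones": 0.15,
    "jawns": 0.05,   # regional Philly slang: rare in aggregate training data
}

print(top_p_vocabulary(probs, p=0.9))  # -> ['things', 'items', 'ones']
```

The regional word isn’t forbidden, it just never survives the cutoff, which is exactly the homogenization she describes.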

joe: Yeah. And on that point, you also get relics of language past. And one of [00:37:00] those, and I made a comment about the em dash, because now em dashes have become endemic in writing, and a lot of times they’re used wrongly. And I’m an em dash lover.

My, my agent actually was like, we gotta get rid of all your em dashes. This was a few years ago, so before all this, but I just love using that. I love jamming words and ideas together. But a lot of fiction and writing of yesteryear used the em dash,

and that has now populated into writing.

And there are certain words that also populate that, because there was a lot of material which was stolen to create the large language models. But there was also a lot of copyright-free material that was used, and that was all older than 70-some years.

So really, you think about how language was then, and that populated these models now, which is correct, it’s English, it works, but when you now get the feedback, it has these relics of our past. [00:38:00] So it cuts both ways. We’re losing our cultural identity, and it’s being homogenized into something of language past, you know. It’s a weird

nick: can you go ahead and explain what an Em dash is?

geo: It’s

joe: than a dash. It’s kind.

nick: Oh wow.

geo: And you use it to put together thoughts?

Yeah. Like when you’re writing like you make

nick: oh 

geo: that long dash, and then you add another little thing, like another little rabbit hole idea,

joe: Not to confuse you, but there’s also an en dash, which is the little dash that goes between, like, when you combine,

not a compound word,

nick: Isn’t that just a hyphen? No,

joe: It’s different than a hyphen.

geo: Yeah,

Lily: yes.

nick: It’s, sorry, we got off topic with

geo: a hyphen, but there’s actually a whole group of writers that are advocating for the em dash.

joe: Yeah. No, they should. No, I love the em dash. No, sorry.

So you and yeah,

geo: But no, I was just gonna say, the other thing I think we’ve really lost is when you [00:39:00] think about maps. Because in the past we’d always use maps, and now we use GPS, but you are basically telling the computer, okay, this is where I wanna go, and it just takes you there. But before, it was more discovery. Do you know what I’m saying?

nick: Slightly more dangerous.

geo: You’re relying totally on what is in the computer telling you.

I just feel like we’ve lost something in that,

nick: Was it the threat that simpl

geo: that simplicity.

nick: I thought it was the threat of being lost while trying to drive and look at a map at the same time and being like, where am I?

geo: there’s some real

nick: Oh, actually you lost the, your dad yelling at you for not giving you directions fast enough.

geo: No you didn’t tell me to turn right.

joe: Well,

Generoso: No, but discovery while lost, though, that’s a very real thing. That’s Lily and I very early in our relationship. We go on walks, [00:40:00] and invariably, if you’ve ever been to Boston, Boston was originally cow paths that someone decided to turn into streets. And so it’s not like Philly, designed by the beautiful Ben Franklin into a beautiful grid city.

You’ll go down an avenue that turns into an alley that turns into a street, and you will get lost in Boston, prior to having this active device in your hand that’s showing you. But through that, we found a million places, and I think that’s part of this lost thing,

Georgia.

joe: Yeah. And I think there’s two phases to using GPS. One is when you do have a fixed destination and you need to get there the fastest; having updated maps is actually a plus. But I think, to Generoso, your idea of discovery while lost, that’s to

geo: the more point that Yeah.

Now

joe: get so reliant on using it. You know, we’re downtown,

I go, let me look. Lemme just look where to go instead of this

geo: somebody else is deciding [00:41:00] what’s important, somebody else is deciding. You know what I mean? And I had this conversation with a friend of mine, and she was bringing this up too. There’s just a group of, I don’t know how many, but there’s a group of basically 20- to 30-year-old white dudes, the tech people, that’s basically deciding what’s important. Do you know what I’m saying? It’s like computer-generated thought, as opposed to.

nick: So, not that I’ve seen the movie in a while, but the movie Cars, Radiator Springs, they were talking about how exactly. Okay, so it’s that idea where

geo: It gets bypassed. Yes. Because it’s not

nick: the town suffers and economics in that town are all down because, oh, this isn’t the route we wanted to go down Route 66.

And you

geo: Right. No, I think that’s an excellent point.

joe: Cars is really Earth in the future. When that’s [00:42:00] AI

nick: that’s the Pixar theory.

joe: and there’s no

nick: more humans.

joe: And so Teslas have now

geo: oh, I see that.

I can see that point too.

joe: and now these vehicles are complaining because no one really cares and comes by to read. Well, I

geo: Well, I can see that point too.

nick: that’s not where I was going, but I’m glad you did.

geo: Oh,

joe: okay. Yeah, no, I felt I was like, wow we’re really taking a turn here. So

geo: a

nick: turn. ’cause we’re in a car.

geo: pun.

joe: Yeah.

nick: Dumb.

joe: I was gonna mention, though, that idea of information, and at some point you homogenize the output. And Lily, you touched on that predictive kind of thing, where the hallucinations might be this kind of game it’s playing, so it doesn’t give the same answer every time. But at some point it will.

And there’ve been these interesting kinds of studies like that, where they go and they feed back in this kind of reinforcement,

Lily: Oh, re yeah.

joe: You take it, you feed back in the results that it had given. And as you go, and people do fun memes where they go, you know, this is Shaq [00:43:00] after a hundred iterations or whatever. And it starts out looking like Shaq.

And then you keep feeding the result back in, like an Andy Warhol experiment.

geo: So is it like a photo? Yeah. A photocopy of a photocopy. 

joe: And as you go, it degrades. It’s no longer Shaq, because the information it’s using to find the predictable answer had little imperfections, and those imperfections magnify into major errors. And they did one where you start with a diverse group of people, and as you feed it through, it turns out to be this one white dude as all hundred people in the group. You know, after not that many cycles. It was surprising how fast it

nick: it actually

joe: gets to the minimum, so it’s really a little scary, you know.
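The Shaq-after-a-hundred-iterations effect Joe describes can be sketched in a few lines. This is my own toy model, not one of the actual studies mentioned: each “generation” is trained only by resampling the previous generation’s output, so rare types drift out and the pool homogenizes. Diversity can only hold steady or fall, never recover.

```python
import random

random.seed(42)

# Generation 0: a diverse "training set" of 100 faces, 10 distinct types.
population = [f"face_{i % 10}" for i in range(100)]
history = [len(set(population))]  # track how many distinct types survive

for _ in range(200):
    # "Retrain" on the model's own output: the next generation is just a
    # resample (with replacement) of the last one, imperfections and all.
    population = random.choices(population, k=len(population))
    history.append(len(set(population)))

# Prints the starting diversity and what little is left at the end.
print(f"distinct types per generation: {history[0]} -> {history[-1]}")
```

Nothing new ever enters the pool, so every resampling round can only lose types, the same reason the photocopied photocopy converges on one face.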

Generoso: It’s a little bit like a modern version of telephone. Do you remember telephone from when we were kids?

geo: definitely. Telephone. Yes.

nick: Yes. Yes, I do.

Generoso: Well, no, but it

nick: It’s in my hand, 

Generoso: No, there was a nun in second grade who did this, this being a parochial school, who literally said something to [00:44:00] the kid in the front left of the class, and that kid’s supposed to turn around and tell somebody. After 40 kids, you get to the last kid.

And it was a completely different message. And this is in a classroom of 40 people, and it’s a very basic thought. And that’s that first moment, forget about Battlestar Galactica, that’s that first moment where you’re like, what are we actually doing with this information as it’s being processed through different people and whispered in ears? Which I think is what we’re talking about.

And that’s the telephone game,

right? That’s what we call telephone game.

nick: Well, it’s also like a rumor. The rumor spreads and it changes, right? Each time it’s told 

geo: gets

nick: a little bit more extravagant if it starts

geo: out purposely wrong, if you’re purposely trying to, let’s say, propaganda 

joe: or please

geo: you know, or please someone

joe: That’s a great

geo: already started out.

It’s already started out in that place, and then where is it gonna end up? You know? Yeah.

joe: Yeah, and I mean, I think then you have culpability. So if you do hype [00:45:00] someone up and you puff up their ego, yeah, that’s a great-looking shirt, that outfit, pinstripes and polka dots, you’re gonna be the life of the party, and you show up in your crazy clown outfit and it doesn’t work out.

I mean,

nick: I actually think that would be the life of the party. Yeah.

joe: See

nick: that’s,

geo: I that outfit.

joe: I’m just

nick: saying go ahead and call me. I will give you, you know,

joe: I think that’s also it, there’s really no stop. I mean, if there’s humans involved, you can go back and really go, hey man, why’d you send me out like that?

You can ask other questions, like, what was the purpose? Or, that was hurtful or mean, did you feed me false information? But with this agent that you’re interacting with now, you really can go down this path and set yourself up for ridicule. I think most people use it thinking they’re gonna find riches and be awesome.

But it could totally be the other way, and probably most likely will go that way.

Generoso: I’m gonna bring back this traumatic moment from [00:46:00] second grade where we played telephone, but let’s make sure we get the one thing. ’Cause back in the day, we have to go back to Philly here, we used to get a soft pretzel at

joe: Oh, love the soft pretzels. Wow.

Generoso: pretzel

and 

joe: pretzels.

nick: wasn’t what

Generoso: yeah, no

joe: get a water? Ice? Yeah. 

Generoso: No, no, 

joe: and a water. Ice. Oh

Generoso: No, just the soft pretzel. And it was at 10 o’clock every day. The initial message was: tomorrow during recess, we are not having soft pretzels. But by the end of the conversation, you get to the 40th kid, and that kid said, so what was the message originally told to the first person? Oh, tomorrow we’re each getting two pretzels

nick: Oh,

geo: Oh,

joe: yeah.

Generoso: now.

But you have to ask yourself what, exactly. It’s a very simple message that started with one person, and there’s one concept in the middle of it: mischief. Somebody in the middle was like, I don’t really like this, let’s turn it [00:47:00] into two pretzels, as opposed to no pretzels.

That’s the factor that we’re talking about.

The X factor.

Lily: Bad agent. Bad agent.

joe: Yeah, and I mean, what if, you know, we played the thought out?

What if everyone had to write their name and their message as they went along? Would that get rid of the mischief? Would the fidelity of telephone go up if everyone had to take ownership of their response? So you’d know exactly where two pretzels came in, and you could track that down and go, aha, this is where, you know,

geo: so

nick: is it just written? Is it just written or is it spoken as well?

joe: We would write it

nick: because then you have to take it in the human error where. You whispered at Georgia and Georgia’s I can’t hear you. I’m just going to, I picked up pretzels.

joe: And that could be, so if you have that written down, you could then distinguish between mischief and just a mishear.

’Cause it could be, we’re assigning fault to someone for being mischievous when they just, I thought you said you had no pants [00:48:00] on when we started the podcast.
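Joe’s written-and-signed variant of telephone is easy to sketch. Hypothetical code (the names and relay functions are my own invention): with every hop logged under a participant’s name, you can walk the log and point at the exact person where “no pretzels” became “two pretzels”. Whether that person was mischievous or just misheard, you’d still have to ask them, but at least the ownership Joe describes is there.

```python
def play_telephone(message, participants):
    """Pass the message down the line; each participant signs what they sent."""
    log = []
    for name, relay in participants:
        message = relay(message)          # a relay may garble or alter it
        log.append((name, message))
    return log

def first_alteration(log, original):
    """Walk the signed log and name the first person whose entry differs."""
    expected = original
    for name, text in log:
        if text != expected:
            return name, text
        expected = text
    return None  # message arrived intact

faithful = lambda m: m
mischief = lambda m: m.replace("no pretzels", "two pretzels each")

original = "tomorrow during recess there are no pretzels"
log = play_telephone(
    original,
    [("Ada", faithful), ("Ben", faithful), ("Cal", mischief), ("Dee", faithful)],
)
print(first_alteration(log, original))
# -> ('Cal', 'tomorrow during recess there are two pretzels each')
```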

Generoso: thought or hope?

joe: and you said cans. I was hoping,

nick: yeah.

joe: This is after Dark

Generoso: discussion. I can do whatever I want.

geo: we’re not,

nick: through some other rabbit holes

joe: you said

geo: you we don’t have video yet.

joe: Yeah. And I was like, I mean, if you don’t wanna wear pants, that’s all right. But that’s the idea, that I wasn’t being mischievous. I wasn’t even being cheeky about it, if I can say that.

nick: Can and you did.

joe: congratulations.

Generoso: and

nick: buddy.

Generoso: No, I’m joking. But you know, you make a great point there, ’cause I did say cans, because that’s the radio thing. And context in that moment, it’s like, why would he say cans? He must have said pants. So,

joe: you were standing up and I was like, all right, hold on. I mean, we didn’t give him a response yet. Like it’s,

nick: but in that exact moment, Georgia thought you said headphones. I heard cans and Joe heard pants.

joe: Yeah. So like

nick: we were all listening [00:49:00] at the exact same time, but only two of us made it to the same conclusion.

geo: So we’re not very accurate, are we?

nick: I was the only one. Correct. That’s all I have to say throughout all the voices in the head.

I got that still,

joe: Yeah. But it’s interesting. I think that’s a very human experience. And, you know, the other limitation, I think, with chatbots is they don’t experience the world.

Every experience is through the lens of other humans that have gone through it. So they can’t actually have a de novo

experience. Yeah, I mean, maybe, I think they’re trying to make sensors. Like, I remember the Star Trek where Data gets the fake skin, the one with the Borg, the movie, was it the eighth or seventh or sixth?

It was one, one of it was up there. Yeah.

geo: So, but he gets, you need to revisit here,

joe: All the Trekkies are like, they’ll write in. [00:50:00] Yeah. But he gets the skin, and he finally gets to feel the sensation of gooseflesh, you know, that prickling, where he really had no words for it before that.

It was all just through other people describing that, an author describing that, or an illustrator with an image. But really, the machine can’t provide that. And so when you ask it questions, to provide that, it has to make guesses based on what it thinks you want and what it’s been trained on, so

Lily: Yeah, exactly, what other people have represented as their experiences in text, video, audio form, which may or may not actually be representative of the phenomenon of experience itself. And that’s always a big part of it, that even what it’s trained on is separated from true experience.

So it’s a hop away. Yeah.

geo: Think about all the things that you put on, like, social media, Instagram, Facebook. Is that really a true representation, or is [00:51:00] that, like, you finally got one picture that looked okay? You know what I mean? So then that’s what it’s training on.

nick: Yes.

geo: And so that’s not even really reality.

nick: Yeah. No, that’s the internet.

geo: right?

joe: that’s the internet. Yeah.

Generoso: No, I, again, going into this whole thing ’cause all of this conversation is about fear, right?

joe: Yeah.

Lily: Yeah. Fear, curiosity, concern. Oh yeah, it is. No, and going back to this idea of digital attachment disorder, the thing that, and I have to say this, ’cause all of these things about manipulation, what is truth?

Generoso: All these things are very valid. I think the idea, besides the sideline idea that eventually AI is just gonna be like, we don’t really need you anymore, is this idea of creating something via a chatbot that is never going to be replicated by somebody else. You’re gonna develop, in the long run, this feeling of, well, the chatbot gives me everything that I want.

Who is going to match up to that in the real world, and [00:52:00] what dysfunction is gonna come from that?

Lily: Yeah. 

Generoso: And that, if there is a great fear that I have, it’s that we talk about what happened during COVID and how we started to lose our ability to communicate with one another. And then you add in an element of people dealing with these bots that give you exactly what you want.

Now you go out into the real world and you’re dealing with people that aren’t just gonna respond to everything you say with, that’s a great idea.

joe: Yeah.

Generoso: All of a sudden it’s gonna look pretty poor to you, and none of it was real in the first place.

joe: Yeah. But that was a quality of fiction, though, right? I mean, when the moving pictures first came out and you could create fantastical kinds of scenarios, that was a fear, that you would become so enamored by that world, or by video games, that you would then try to act out or relive those experiences

as the correct human experience. I mean, that’s [00:53:00] so you

nick: I do it constantly. Don’t worry. In a game, once I saw parkour, I was like, I can do that.

joe: Yeah,

Generoso: She said no.

joe: you’re

geo: Not a good

joe: You’re out looking for zombies. Yeah, so it is interesting: is the human mind adaptable enough to actually parse real interaction versus fictional interaction? And is that a question of how real the chatbot becomes? Or can that line, will it be breached? Yeah.

It’s a fascinating point, but it feels like we’ve had technologies which introduced this kind of realism and fantastical-

cultism,

I dunno if that’s a word, but I like saying it.

geo: Sure it is.

joe: that’s a made 

Lily: I think the thing that this is highlighting is that,

in a world where you still live in a communal setting and you’re connected to reality, if you decide that you [00:54:00] are gonna live out atu, right?

geo: Yes.

Lily: You are still in a community that will tell you like, Hey that’s not exactly right.

I think the scariest thing is that we only learn and grow, and also check our understanding of our own realities, when it’s challenged. And I think if you are in a world where you are steadily separated from a community structure, right, a societal structure, where, day in, day out, you’re at your computer, you’re barely even going to the grocery store, right?

That’s actually a very real, feasible existence right now. That’s when it becomes a huge concern, because there’s nothing that is going to challenge you in your thought. There’s nothing that’s going to push you in your growth. There’s not even another being there, [00:55:00] not even there to challenge you, but one that just has different motivations, right?

Like, maybe it’s not motivated to be interested in growing plants like you are, right? These are all kinds of factors of, I think, sociability that are disconcerting, and that I think we’re worried about losing.

joe: Yeah.

nick: So are you saying that we should just go ahead and challenge more random people to things like, oh, you’re gonna pick up that ketchup. Interesting choice.

Lily: I sure. If you have no,

if you, if you have an opinion, sure.

geo: I think, is

Lily: no, you know, don’t be combative. But I

nick: Full.

joe: Is that

geo: is that another thing you’re gonna offer?

nick: Yeah, I’m gonna,

geo: that’s gonna be part of

nick: It’s a hundred percent part of my service.

Lily: the ketchup cup, 

joe: you’re right. Yeah.

nick: You sure you want to go with that one? Have you tried the barbecue sauce yet? And you should try the barbecue sauce.

joe: Yeah.

Lily: Easy answer. Barbecue all the time.

joe: I was gonna say, it reminds me of Wall-E,

nick: Wally, 

geo: where, oh, we’re bringing Wall-E up [00:56:00] again

joe: second time this season, but yeah, that Wall-E 

nick: Wally, 

joe: so we had the Space Arc episode, and then, but in Wall-E, you had all the people who were being controlled by Auto, the AI who had their best interests at heart, and they were all just in their screens, disconnected

geo: their life.

joe: else. I mean, I dunno if you guys seen Wally, the animated flick. Okay. Okay. I, you guys, it was like the look here, I’m like my, 

geo: you ma?

joe: I

nick: what is,

joe: Are you sure you’re human?

Lily: not, but

that’s that’s how you know I’m human.

joe: You’re in for a treat.

geo: Yeah. You definitely should watch Wall-E.

It’s a very human show. Yeah. Yeah.

joe: So.

Generoso: I’ll definitely.

joe: But yeah, it was the scene in there where Earth had become unlivable. And so the idea was to put everyone on these kind of space arks and send them out into space. And then they had robots that would clean the Earth, the Wall-E units, and they would radio back.

They would check in every once in a while and see if life could be sustained [00:57:00] on the planet. And in the interim, several generations had gone by, and the people went from very interactive and community-like to this fear that you’re bringing up; they were in, like, little

geo: yeah they, yeah.

Little cart. Cart.

joe: And they had a screen, and every wish and desire was fulfilled by the AI that controlled the ark, the space

geo: they, and they got really huge. And they, yeah. Probably couldn’t even walk anymore. ’cause they just,

joe: was that scene. Oh, I don’t wanna spoil it.

Now for anyone that hasn’t seen it,

Generoso: Lily.

nick: Yeah. Anyone who hasn’t seen it. Joe,

joe: don’t wanna name names, but. He was

nick: he was eyeballing you the entire time he said that. He was like,

Generoso: I’m pointing at.

joe: I saw her face and it looked like just a total I have no clue what this guy’s talking about. He’s like hallucinating right now. I,

geo: right now. I

joe: but yeah, that it’s an example of that where you lose total connectivity and it then they got pulled back into the whole thing [00:58:00] we wanna live, like it was that

geo: like, yeah, they

joe: movie moment.

geo: The AI got shut off and they were like, whoa,

nick: what about the people that wanted to stay in the, with the AI they were just screwed.

joe: right? That was whatcha

nick: I just wanna sit here. Don’t worry guys. You guys go ahead. Have fun

geo: I’m happy. My ignorance is bliss.

I don’t know.

joe: Cypher in The Matrix, right? He wanted to get plugged back in.

He didn’t want to eat the mush and have a shaved head and all that. He wanted the cool outfits and

geo: To your point, it was better than reality. So, I don’t wanna go back to dealing with real people, it’s, you know,

joe: No, a point I had. We talk about the AI and how we can get to the horrific scenario that Generoso had pointed out,

the, you know, us becoming large mechanized slobs in front of a screen, us having AI control us. But I was gonna say that one of the interesting things with AI, too, is reaching beyond the grave: now that we have so much [00:59:00] of ourselves in a digital sphere, one could reconstruct loved ones and their identity.

And I thought that was just really interesting as I was thinking about this and putting things together. But yeah, you get into that meta, and I know you guys, that’s where you guys live. How much would you trust that, right?

geo: Yeah.

joe: yeah.

Lily: You know, there are a couple of startups that are actually focused on that. And yeah, I wanna say about a year-ish ago, the Tech Review had a kind of inside view on, you know, getting somebody who had passed away recreated in, like, avatar form. And again, it does come down to this idea that you can

feed memories, you can feed all the documents, but the intangibles of what we are are actually very meaningful. And the inconsistencies, right? I’m not going to [01:00:00] say the same sentence the same way every single time. Even if I’m using the same words, I’m going to have inflections, I’m going to have body movements.

I am going to age right? Like things that are very human. You start to lose in

geo: right

Lily: these recreations of your loved ones. And then on top of that, you know, our memories of our loved ones are always a little bit glossed over and aggrandized, too. So I think that, even if it’s built to spec, you’ll always be a little suspicious of it.

joe: I mean, and you always want that perfect version. You don’t want all the crud that comes with the person. You want that memory. You go into the box to talk to your loved one; you don’t want to hear about how they were an awful person and fooled everybody in the telephone game.

You know, that’s not the

Generoso: The horrible moment that you’ll just bring up.

joe: Aha, it was you, though. Yeah, I think it’s really fascinating, that thing. And then there’s that idea of policing it, too, because maybe you are trying to find answers to why, you know, some life thing happened with this past relative, and then it’s just,

nick: Well, they brought that up in Ironheart too, with Natalie.

joe: Oh yeah. You’re right With the friend. 

geo: Yeah. 

nick: Where she realized that, oh, I know it’s not you, because you don’t know exactly what happened in these situations. You don’t know why you’re feeling that way, ’cause you don’t feel that way.

joe: And Ironheart was the Disney Plus, MCU universe one.

She was a tech genius from inner-city Chicago. So, great. Look, even, even, yeah,

Generoso: Less technical, and I think of one movie, as I’m sure everybody here is an Alan Rickman fan in some way, shape, either from Die Hard or from something. Do you remember a movie he did called Truly, Madly, Deeply? He did this movie in the early 1990s, where a [01:02:00] woman loses her husband, who was played by Alan Rickman, and then one day he shows up as a ghost, and she’s elated.

Up until that moment she had been miserable, wasn’t going out, wasn’t talking to other people, just constantly rehashing the memory of this person. Well, now Rickman’s back, he’s a ghost and he’s there, and it’s him, but he’s a ghost, and he watches movies all day long.

And she starts to go out and starts to experience life, and she goes back home and her husband is still there and he’s a ghost, and he wants to watch movies with her, and that’s all he wants to do.

There’s not really anything bad about him, but there’s not a moving forward either.

The AI version of it. I don’t know, like if

joe: Yeah. I mean, you’re paused, right? If you’ve died, you’ve lived all your life, so that’s it. So you really can’t go beyond

nick: unless you’re Patrick Swayze. That’s true.

Generoso: There’s only one Swayze. We’re [01:03:00] not hoping for extra

joe: and embodying Whoopi Goldberg. I mean, what are we doing

geo: I know. I was like, I think I missed the point. Oh 

joe: yeah. 

nick: Ghosts with the, you know,

joe: We’re coming to the end. But the other thing is, good capitalist that I am, there are economics in all this. And, in the worst model, Generoso, not only do you lose connection with humanity, but now you become enslaved, paying a fee to maintain your artificial connection, because now you’ve lost the ability to make a real connection. That’s even more fearful, that you can’t even go out now and go to the bar and meet some friends. You’re so dependent that now you’re working

geo: a subscription

nick: Is this about bringing a dead one to life again, or... no, this is just... oh, I was like, are you saying that, like

geo: I think he’s,

joe: could charge a fee for that also.

Like you wanna talk to grandma? You need

nick: I’m gonna kill grandma again.

I can’t afford it this month. [01:04:00] You’re gonna work

joe: this time if

geo: well, you know, there’s gonna be a price tag on that, right?

Yes.

nick: a subscription.

joe: be. We’ll kill her again and again. 

Generoso: Johnny Thunders sang you can’t put your arms around a memory, but you can purchase one via AI.

joe: yeah. There it is.

Generoso: A wonderful thing and

uh, but you

still can’t put your arms around one.

geo: but if you wanna keep it, you’ll have to keep paying the

joe: That’s right. That’s right. We’ll erase all those memories.

geo: So, 

joe: I mean,

geo: Joe, that’s

joe: a, that’s the evil part of it though. I mean, that’s where we’re gonna be at, man. I, that’s, I fear that more that you’ll be, people will become so caught up in it that, that then they’ll also be

working 

and that whole weird economy will come out of it.

Oh

geo: there’s always some sort of way to make money on it.

joe: right. Yeah.

Generoso: Sure.

geo: I was gonna say, Joe got your book at CAKE.

Generoso: At CAKE, which was awesome, by the way. And thank you for buying the book. [01:05:00] And, separately, thank you for hanging with us at CAKE.

nick: it’s so

geo: beautiful. And I just, well, we don’t have video, but we can

joe: we’ll put pictures and stuff in links. Yeah.

geo: but I and this really stuck with me, this page with the cameras

nick: And you’re showing the camera, right?

joe: Yeah. You

geo: I’m showing them ’cause

nick: they

geo: know what page I’m talking about and I just love this.

joe: if they’re really human. I know what page you’re talking about.

Lily: Yeah. And we are, because we love that page and we labored over it. Well, really, the credit goes to Generoso. He drew it.

geo: just,

Lily: I, yeah,

geo: with the words and the pictures, and I probably won’t even do this justice, but I’m gonna read this page, please. Unless you have the page? And you guys

Generoso: No, I, we are extremely honored that you would read from our comics. Thank you.

geo: “I used to have the desire to stop time, like past generations once did, to capture wonders. I strongly felt that [01:06:00] synthetic experiences betrayed reality,” and I just thought that really summed up a lot of what we talked about today. So,

joe: yeah, definitely.

Lily: Thank you. Yeah, a lot of what we’ve been talking about is a big part of the world of Inversion, actually. Because both the scientist and the subject inhabit that space and that time where we don’t have community anymore; how we live is very much contained to a room, feeding into something in particular.

And outside of that, in our leisure time, how we used to experience reality is completely gone. So we create these synthetic experiences for ourselves. Sometimes they are complete fantasy, sometimes they are launched from points in reality. You know, all of the things that we’re starting to see happen, right?

When people ask: make me this movie, tell me this story.

geo: right? Yeah. So just get out [01:07:00] there, talk to real people,

nick: or just call me. Just call me or call

geo: Nick.

joe: You hit him up on the yeah. Call Nick. 

Generoso: The EBS of the 21st century.

joe: don’t become clippy.

You guys remember Clippy?

nick: I love Clippy. You leave Clippy out of this, 

geo: I miss Clippy.

nick: your tongue, Joe.

joe: Hey man, I am nice to all of the chatbots, ’cause when the AI overlords come, I want them to go, oh, you know, that guy? He said thank you. Every once in a while. He

nick: Every once in a while

joe: It was like, you know, they’re human.

They have flaws, right? But, you know, yeah, we’re coming to the end. We probably could go on, we could double this episode, there’s so many things, and I have notes here, and I always say that if I don’t get through everything, that’s great. So yeah, why don’t you go ahead and talk a little bit about what you do, your books.

We have a number of science friends who listen. This is a, you know, sciencey podcast, and what

nick: wait, where are the science [01:08:00] podcasts? I thought we were anti-science. We’re

joe: science for weirdos. 

nick: Oh man.

joe: once

Generoso: That matters.

joe: walked into a bar, 18th Street Distillery, and someone was like, I know you guys, you have a podcast.

It’s Science for Weirdos. And we were like, oh yeah,

geo: our new tagline.

joe: that’s us. Yeah. I was like, can we use that?

geo: So

joe: So that’s where I came from if everyone wondered that. But yeah. We met at CAKE and you guys,

geo: Cake.

Tell ’em what cake is.

joe: the Chicago Alternative Comics Expo. It happens once a year in the summertime, I don’t know,

nick: this 

joe: by June, June-ish.

Yep. It’s great; it moves around the city to different venues. Just a great experience. If you haven’t done it, go look for it. But I’m gonna turn the mic over to Lily and Generoso,

Generoso: Before we even get into the comics, I do wanna send love out to the CAKE folks. We did a few

nick: here.

Generoso: cons this year, and I gotta tell you, organizing-wise it was great. It was fantastic. But in terms of the folks that came, like [01:09:00] yourselves... no, but for real, everybody came up, so many STEM people.

Which does play to a lot of what we do on top of the comics world. But it was one of the greatest audiences we’ve ever encountered at one of these events. So again, much respect and love to the folks that organized CAKE, and the attendees just really made the experience great.

So thank you for that. But 

Lily: yeah, 

Generoso: as far as our books, 

Lily: yeah, our books. We have now 1, 2, 3, four, and we’re working on the fifth. The first three books are actually a triptych. So they are three separate stories set in three completely different times, but they each explore the relationship between a scientist and their subject.

So Vessel, the first one, is much more from the subject’s perspective. Inversion, the second one, is actually the combination of a scientist and a subject. And then the last one is about [01:10:00] a scientist who is becoming her own subject. I think for us, what we’re always focused on is thinking about how individuals experience

science and technology, whether they work in it or whether they are actively participating in its future in one way or another. And I think more recently, as I hinted at, what we’ve been really focused on is thinking more about the foundations of science, almost even going back to the Middle Ages and earlier, and thinking about what it means to study phenomena, try to bring it together, understand causality.

So that’s a little bit of what exists in our fourth book, which is called Absolute Simultaneity. And yeah, it’s, I think it’s something that we’re going to be building even further upon in our upcoming book.

Did I cover everything?

Generoso: I think you did, beautifully. Me, I’m just a guy that draws the pictures.

Lily: That’s not true. [01:11:00] He always says that. We’re a really super collaborative team. The way I always like to describe it is, if we had to compare it to movie terms, I’m the screenwriter, he’s the cinematographer, but we really

work together. And yeah.

joe: Yeah, you can tell. No, it’s really good. We have that copy of Inversion here, and I look forward to reading your other stuff. It’s really awesome.

And like I said, there’s a lot of sciencey folks who listen, so, you know, definitely go check it out, pick up the books. They’re really awesome. We chatted for a good long while while we were there. I actually... I think, Georgia, you wandered off and I was still there talking.

Then you wandered back and I was still there. It was such a great conversation, and I’m glad you guys came on to the podcast

nick: Yes, thank you so very

Generoso: dude. Thank you for inviting

us. 

joe: been fun. Yeah. We gotta, you know, probably have you back and, you know, we’ll follow up on this as we move closer to the robot apocalypse.

You know, 

geo: we can

joe: sit back and listen to

geo: in with each other, it’s just

joe: you [01:12:00] know. Yeah. So cool. Anything else? Any last thoughts, Nick? Georgia.

nick: What would your favorite AI movie be? Like, AI in the film. Not

joe: I know Lily loves Wall-E that’s it.

geo: Challenge.

Lily: myself with something completely unknown. Oh, this is a movie 

Generoso: that’s a great... I mean, it’s funky, ’cause you can’t see this at home, folks, but behind us are about 800 signed movie posters. So right now we’re like, AI?

joe: they’re all looking back there what’s in there? I’m looking at my walls too. I’m like, what? I have here oh,

Generoso: that’s a really good question. Even going into this conversation... obviously, you know, growing up, because I think Joe and I are around the same

joe: Yeah, I think so. We’ve discovered that.

Generoso: Terminator

As a kid, that was the first thing that was like, oh. And it played off of Battlestar Galactica, which is the first time you [01:13:00] start to get this idea that, you know, maybe the machines will replace us. Westworld

geo: Oh, yes. Now Westworld, are you talking about the movie or the TV series?

Generoso: Oh no, the original movie, and Futureworld. Yeah, but I think for me... Lily’s, you’re more of the book person though, in fact, yeah,

exactly. Like so 

nick: you can go book. 

joe: Yeah. We’ll take book.

nick: I’m not gonna shut you down for that.

joe: at all. Well, we’ll at least let you think. Go ahead, Nick. What do you got? Because you, you

nick: gonna say Tron.

joe: Tron

nick: yeah. I absolutely love those films. Oh. I haven’t seen the last one, and I am gonna preserve myself from that. But

geo: meaning, you’re not gonna do it. You’re not gonna watch it. No. Uhuh

nick: Jared Leto, I’m not a fan. He kinda just drops the ball for me every time. But the original Tron,

joe: yeah,

nick: Peak. Loved it.

geo: I’d say Ex Machina. I love Ex

nick: yes.

joe: Definitely.

Generoso: Oh, wow.

nick: I forgot how much I love that film,

geo: isn’t it? [01:14:00] It’s just, yeah.

nick: Oscar Isaac. Ugh,

joe: No,

nick: Joe.

joe: we good?

Generoso: You killed Lily. I just want you to know that. No, she’s going

Lily: through, I’m just going through 

Generoso: like hundreds of different things

in 

nick: I’m gonna

go, are we sure she’s not an AI? I’m gonna

joe: with the, 

Generoso: Wall-E

joe: got that right. That’s it. She’s just gonna go with that. I’m done. I’m gonna go a little more sinister. You know, I think we have HAL 9000 from 2001: A Space Odyssey. That was probably

nick: Oh yeah.

joe: and I think early horror... like, I talk about in the episodes how early I saw some of these movies, but MU/TH/UR from Alien.

Yeah. And that was just there where you had that controlling, you know, chatbot-other thing. So I’m like, we’re done, episode over. But yeah. But chatbots don’t really have an agenda right now, or not that we know of. They’re, they are

geo: do 

joe: really? Yeah.

They’re really there to... they have no agenda when you prompt them and things like that.

But HAL 9000, MU/TH/UR, Skynet [01:15:00] from Terminator, they had agendas. This was their agenda and this was their mission, and they were gonna be unstoppable, and logic didn’t factor into that. So I think from that point, they were some of the... and then AUTO from Wall-E, that also had that same

geo: Oh, I thought of another one, but,

joe: All right. Let’s, Lily has something I

Lily: I think I got it. I think because, you know, one of the reasons why I struggle is that when I think of science fiction, and I think of dystopian writings and cinema, I’m always thinking about concepts that are playing around more with perception, a little bit with time.

So I’m not always in the artificial intelligence realm, though it’s artificial intelligence adjacent. But I think a film that involves some AI, as well as this interpersonal relationship and understanding how we communicate and interact as humans... I’d have to go with Solaris, probably.

I think that’s the 

Generoso: The original Tarkovsky Solaris. Yeah. Based on the [01:16:00] Lem book.

nick: yeah.

geo: Wow.

joe: yeah. Wow.

And a book, a novel I thought of, was Sea of Rust by C. Robert Cargill. Really? Oh yeah. It’s a great novel, and I think there’s a sequel to it. But the novel starts out with the last human being killed by the machines. And so the whole book then is about the machines.

Now

geo: it’s the point of view of the machine, of the

joe: and this

geo: and the machine is very human.

joe: are, yeah. They’re very sentient. They have personalities, and there are hierarchies of the machines. And you’re following these scavenger bots.

And they’re trying to live, trying to eke out a living. They’ve replaced the humans in some way. So it’s really, you know, they do all

geo: which is just more about the human experience,

joe: But

geo: But not,

joe: you know, machines. And you get that: after humans are gone, what are the machines gonna do? They’re built on human philosophies, and, you know, so at some point, will they just revert back? We talked about that. Will they just cycle back,

nick: [01:17:00] So the Master Mold Sentinel? Oh

joe: yeah. From X-Men. Yeah. That’s it.

Generoso: Oh yeah.

geo: yeah.

nick: Sorry, I had to bring that one

geo: and then

joe: right? Is that

nick: right.

geo: our son just... he’s away at college, but he was here for Thanksgiving and he wanted to watch a movie. And the movie he really wanted to watch was Blade Runner.

joe: Oh, that’s right. Yeah.

geo: And that thing... think of that. That’s a good one too.

joe: Yeah. Blade Runner. Yeah.

Very good. Yeah, we could keep going. This was like, you know,

Generoso: we still recording? Actually.

joe: still are. Yeah. Yeah, we are.

Generoso: Okay. That’s awesome.

geo: there.

joe: Yeah. No,

nick: actually didn’t start 

Generoso: and lovely. I think it’s 

nick: I didn’t stop we

joe: there’s an end. There’s a special ending, so yeah, people who listen know. Yeah, we’ll get there. Okay.

Generoso: I was waiting for the special

ending, but I was like, are you going to add that after? Is it

during? I

nick: It,

Generoso: no, we could talk all night. Yeah, I’m fine with that. I’m just like,

geo: But we’ll, yeah.

joe: Yeah. All right. We should probably wrap it there. Thank you. You got me Joe?

nick: You got Nick?

geo: You

joe: got Nick. We’ve got Nick Georgia. We’ve got

nick: Georgia. And thank you so much again for [01:18:00] being with us.

joe: Hang on. 

Lily: Pleasure.

nick: and we went down some ho

joe: robotic holes 

nick: don’t forget to call me. Really?

joe: We love you. Stay safe.

nick: Bye-bye.

joe: Stay curious.

Transcript of Episode 56: Medicine of the Future: From Fantasy to Patient Care

With Guest: Davis Ashura

Substack · Apple · Spotify · YouTube · Amazon


joe: [00:00:00] Hey welcome back to the rabbit hole of research down here in the basement studio, take two.

geo: Oh,

nick: you said it. I was gonna, I was gonna do a clap.

joe: Yeah. Yeah, I had a little incident with the soundboard. But yeah, we’re here for another exciting episode. Thanks for joining us. We’re gonna be all crewed up.

You got me, Joe?

nick: Yeah, I got Nick.

joe: Yeah. Got Nick. We got Nick 

geo: Georgia. 

joe: We’ve got Georgia. We’re gonna be talking about medicine of the future from fantasy to patient care, and we have a special guest with us

Davis.

Davis: Hi. I am Davis Ashura I’m a physician by training. That’s what I do every day. And I also write Epic Fantasy novels.

joe: Awesome.

nick: thanks for being here, David.

joe: Yeah, thank you

geo: Davis. Davis.

nick: I left off the ending part.

Davis: yeah, everyone calls me David, but the name itself is a little bit of a joke. It’s my pen name, but if you say it correctly, Davis Ashura is Devas and Asuras, the angels and demons in Hinduism. So

joe: Oh, [00:01:00] wow. That’s really cool.

nick: See, we didn’t get that

geo: first take.

Okay. 

joe: No, we didn’t.

nick: Whatcha 

joe: you bringing it up? Why

can’t we let my mistakes just fade away? So I have a little monologue to get us into the episode, so can’t be

geo: I can’t wait.

joe: A healer waves their hand and the wound closes. A character drinks a potion and broken bones knit whole,

lifelines rejuvenated. Every culture has told stories about miraculous healing, from sacred springs to revival spells. We’ve imagined a body that could be fixed as easily as armor in a forge, and for centuries, that’s where these ideas lived: on pages, in games, in myth. But reality has been slower.

Cells hesitate. Tissues resist. Healing is measured in years, months, weeks, not seconds. Every cure is a negotiation with biology, and biology doesn’t take orders. But something has started to shift. We’re programming immune cells like video game characters, growing organs, prolonging life: medicine that [00:02:00] can adapt faster than disease can evolve.

The gap between what we imagine and what can be done is thinner than it’s ever been. We’re here to explore the strange middle ground where magic starts becoming reality and science starts feeling mythic.

nick: like science always does feel mythic. Like if you don’t understand what’s going on, it just seems like you’re doing spells over there.

geo: It’s like magic. Yeah. Yeah.

joe: yeah. No, I mean, I think that’s always the gap between the science communicator and a physician who’s trying to explain complex kinds of things and terminology. So in your practice, you are, Davis?

Davis: I’m an endocrinologist. Sorry about that. I’m an

joe: no, you’re fine.

Davis: which means most of what I deal with is chronic, long-term diseases: diabetes, type two, type one, gestational, whatever; hypertension, hyperlipidemia, the sorts of things that happen as we get older. And one of the interesting things [00:03:00] about a lot of those spells that you mentioned that instantly heal wounds is that those are acute injuries.

And like I said on take one, a trauma surgeon would probably be able to handle that better than I would in terms of explaining some of it. But if you think about all the things that have to be healed when somebody has their arm nearly severed: it’s not just that flesh gets knitted, it’s what flesh gets re-knitted.

It’s their arteries, their veins, their capillaries, soft tissue, any bone bruises, but also tendons and ligaments, which don’t have good blood supply. So it’s like the spell is hyper-intelligent at figuring out what needs to be fixed. And so that’s interesting. I hadn’t really thought about that until just now, especially the lack of good blood flow to tendons and ligaments.

Then what I was also saying was, one thing that I like to think about is how the pain affects the person. Because in the real [00:04:00] world, when a person is repetitively injured, they shrink away from the pain. And there are cases of that you see in football, a violent sport, where running backs, when they’re young, will get to the hole and hit it hard and go far.

But just three years later, they’re still young, they’re 25 years old, they’re not hitting that hole quite as fast and their career is done and it’s because they can’t take the pain

anymore. 

joe: Yeah. And that, I mean, that’s just due to the repetitive wear and tear on their frame.

geo: And I think a lot of literature like pop culture and things don’t necessarily talk about the pain behind. Yeah. You might talk about these miraculous healings, but not talk about the pain.

nick: So a football player wouldn’t get used to the pain after a while, or like

joe: you mean get

Davis: I don’t think they would.

joe: to the

pain? I think it would just, yeah, gradually get worse and worse. That’s why they can [00:05:00] turn to addiction and addictive painkillers and things like that, to try to numb it.

But at some point the pain will override it. It’s gotta be a throb. I mean, that’s a constant.

Davis: I mean, I suppose if they were perfectly healed after every season and that punishing shot from the linebacker to their shoulder, which is probably hitting some, previously injured spot that no longer would be an issue. ’cause they’re perfectly healed. If they had that magic potion, maybe they’d still be able to perform as well as they always have.

But something about

It’s not even the pain, it’s the anticipation of the pain that causes them to pause.

joe: When you think of athletes their mentality is a bit different than the average, couch potato watching the game that goes, I could do that.

Like, I. And

nick: Oh, why didn’t you go for that?

geo: that? And then you think

joe: about it: the player who rides the bench. They’re not the [00:06:00] star, they don’t come into the game, they’re the last person picked. Still, their mentality is so different from just an average person going along who stubs their toe, potentially, because, to Nick’s point, I think there is some tolerance of pain, that you can shove it out of your head.

And you’re right. I mean, what you’re getting to is that at some point those skills don’t work any longer. With the injury, the repetitive injury, the repetitive getting hit, you just start thinking about that. And once you start thinking about getting hurt, you actually are more susceptible

at some point. Because you’re now protecting yourself so as not to take a blow, which then exposes you potentially to take an even harder blow, or sets you up to not perform well.

nick: thought that would do the opposite, like, all right, you’re anticipating this, so you’re gonna be able to go ahead and go this way a little bit to avoid that, to get through it.

joe: Yeah. I mean, the other thing is that you are [00:07:00] getting older and there is this natural aging that starts to happen. So at some point yeah. Father time is undefeated.

nick: Is there a potion yet for anti-aging? Like, is that.

Davis: there, I think y’all would know more about that than I would. I know what’s available on the market, and I think y’all are working more on the cutting edge of what can be available. And that’s pretty exciting. I just read around the edges about what can potentially happen. But just in terms of my own field, one of the things... I mean, everybody’s heard of it now: GLP-1.

joe: That’s right. Yeah. 

Davis: They’re the first true drugs that we’ve had that have actually helped with weight loss on a long-term basis. We’ve had drugs that have helped a little bit, maybe five, ten percent weight loss, but you can’t stay on ’em for very long. These are the first drugs that allow for sustained weight loss on a long-term basis.

Now, the weight itself is [00:08:00] less important to me than what that means underneath. So then we talk about physiology. With loss of body fat, there are probably lower blood sugars, and not just a lower hemoglobin A1C, a monitoring measure for your diabetes, but maybe you actually normalize your blood sugars, you normalize your blood pressure, you normalize your lipids, and you lower your risk of heart disease and kidney disease because of all of those things.

So it’s pretty exciting, and there’s a ton of money being poured into all these different GLP-1s and other aspects of the entire, I guess, intestinal endocrine system. GLP-1, glucagon-like peptide-1, is just one, but there’s also GIP, glucose-dependent insulinotropic polypeptide, and blocking glucagon itself.

It’s interesting. It’s cool, and there’s a lot of research being poured into it. But in terms of anti-aging, that’s the only thing that I know of that sort of reverses aging, because those are chronic diseases. [00:09:00] To not have them would be pretty phenomenal.

joe: I mean, I think with the research on all of the diseases associated with aging, mental and physical, there’s a lot of progress that’s happened. And we have seen life expectancy increase with more modern medicine, from 30, 40 years to now 70, 80 years.

So a significant increase there, and I think that will probably continue to increase. And with aging, you have things like telomere length: you have caps on your DNA that shorten as you age. And Dolly the sheep, that was cloned, and you go, oh, it’s a new clone.

That’s age zero. So you took it from an adult sheep, made a baby sheep that’s identical. The thing that was interesting was that the telomere length actually stayed the same as the adult’s. It didn’t reset. So that’s what happens when you clone from an adult.

geo: does that mean they’ll [00:10:00] age faster?

joe: not faster, but technically their cellular age would be equivalent to the adult’s, even though they’re a youth.

geo: stinks.

joe: yeah, it does. So that’s

Davis: It would stink,

Right. 

joe: In sci-fi fantasy, when you clone yourself and you go, oh, it’s a younger clone... but really, genetically and cellularly, they’re probably the same age as the original, among other handwaving things we have to figure out with cloning.

But

nick: So would you have to take that bit as a younger age?

joe: Yes. That’s the idea: if you could harvest cells at a young age, which people are doing, then you can create organs, create kind of body parts. That’s the idea

geo: You replace your body part with a younger body part.

And it’s based on your own DNA, your own.

joe: So this kind of idea of personalized medicine... I’m feeling that’s way outta my price range.

Yeah, that’s, I was gonna,

geo: I don’t think my insurance [00:11:00] covers that.

joe: I was gonna say, and you, Davis, on your about page... everyone go check out Davis’s webpage; I’ll put the link in the newsletter when it comes out.

But you have this thing, if the insurance lets you do it. So it’s this whole idea

Davis: I have to plead the fifth about insurance companies, or all seven of those words I’m not allowed to say will come out. I cannot stand

nick: How trash they 

Davis: I’ll say that much and end it there. What was that?

nick: I said how trash they are. Oh.

I think

geo: but either confirm 

Davis: have no 

idea

joe: But, and not to get you in trouble, but I think you do get into some really interesting ethical questions when money starts to come in, and scarcity of these technologies, and haves and have-nots.

Like, who actually gets GLP-1? You mentioned that there are a lot of insurances that won’t cover it if you’re not diagnosed with pre-diabetes or diabetes, because that’s its primary,

Davis: It’s only for type two diabetes right now. [00:12:00] It’s supposed to also be for fatty liver disease; one of the GLP-1s got that indication. And we can get into the weeds of those kinds of things, but you’re right, they’re not covered, unless they’re also covered for weight loss. But not all insurance plans cover weight loss medications, particularly GLP-1 agonists.

So yeah, they’re great drugs for lots of things off-label, but you know.

geo: But then 

Davis: We write for them, and then they go to the pharmacy and it’s $1,800. Who can afford that?

geo: There’s reality that going

joe: too far, I mean, can you just talk a little bit about what GLP-1 is? A lot of folks aren’t in the

Davis: Oh, I’m sorry. 

It was actually originally discovered back in the 1990s. It’s called glucagon-like peptide-1. In humans, it normally is metabolized within a few minutes. But what we found was that it has a plethora of different effects.

It slows gastric emptying, it potentially increases [00:13:00] insulin release, and it seems to have an effect at the level of the brain to increase your sense of satiety. So you don’t want to eat 

as much. And it also causes a little bit of nausea, and in some people a horrific amount of nausea and other GI upset.

But in terms of the good effects: it’s slowing gastric emptying, increasing insulin production from the pancreas, and increasing your sense of satiety so you don’t eat as much. There may be other things that we’re still learning, because there are receptor sites at other locations where it might have a beneficial effect on cardiovascular health in a direct way, rather than just through improved blood sugar and weight loss.

It might have direct effects at the level of the kidney. But the first GLP-1 agonist that was synthesized was actually discovered in the spit of the Gila monster,

geo: The

nick: what?

Davis: the spit of the Gila monster. You know the

joe: The lizard. The big lizard. The Gila [00:14:00] monster. 

nick: Oh. 

joe: Not a Sesame Street character. What do you think?

nick: I’m just so that was in

geo: Wow. And then to discover that I’m really curious how that

nick: they licked it.

Davis: I have no idea how they discovered it, but that was exenatide.

geo: I’m like, oh, I think I’m gonna check this.

joe: I could have been doing research on the Gila monster. I mean, I think there are a lot

Davis: that’s

geo: But then to translate that into, that’s fascinating. There’s a

joe: lot of cross-species work where you’re looking at another organism and then you discover something like that, and then that translates

to Human kind of biology.

So yeah, it happens a

bunch. 

Davis: I think that would be interesting, if that’s how potions work too, like magical potions. Somebody is trying to figure out how to heal something and they’re like, oh, the sharks heal pretty nicely. There’s arctic sharks, and

joe: I think 

Davis: grab something from them.

joe: that gets to

maybe Nick’s point about that line between science and [00:15:00] myth and kind of fantasy

nick: where it all sort of blurs

joe: sometimes you have something where you really don’t understand the science a hundred percent, but it does this function, and then you can manipulate that to work. And, like potions, you don’t necessarily have to understand all of the science around it.

But it will heal up the bones or, a

nick: this makes you feel better.

joe: Yeah, exactly. 

Davis: think in some, either some video games, some literature, that is how these potions are created. They take these different plants, fungi, whatever, moss, and they just know that if you mix them, ’cause they have certain properties that they’ve investigated, and if you mix them, you heat them to the right temperature.

It’s basically chemistry at that point. But they call it alchemy. Then they create this tablet or this potion or whatever that has the properties that they want. It is interesting how science has seeped into that in a lot of ways.

joe: Yeah, no, and I think that was always one of the issues [00:16:00] with deforestation in the jungles and things like that, that there are a lot of botanical species, insect species that we just clear through, that humans have devastated, which potentially hold pharmacological compounds that might be useful in disease prevention, therapeutics, and these functions.

And so it is, it’s one of these where you have a lot of these ecologists going in and then taking samples to actually process later and see what compounds are in them or what they do, because tribal communities, they didn’t suffer from this disease.

Why not? Oh, they made tea out of the bark of this plant. And now it’s gone. So now no one knows why.

nick: So on that one I was watching the show’s Common Side Effects. It’s on Adult Swim where this guy comes across a mushroom that ends up being like, I heal all [00:17:00] video game style, mushroom. I don’t know what you’d call it, but it was a whole story about the big pharma not wanting that to become a thing.

Is that like an, is that something that they would, I don’t know if you could speak on this, but is that something they would put a stop to.

Davis: So if there was a naturally occurring substance that could do the same thing, let’s just say, as a GLP-1, help you lose weight. And you don’t have to take it in a tablet form. You don’t have to have it created in a lab. It’s just something that you can grow in your garden, right? I don’t think that they would want that to be available to the public.

I don’t know if they could control it, but I don’t think they would want it available to the public, because your best health has always been and always will start in your kitchen, not in your doctor’s office. It’ll start in your kitchen, and what you eat is gonna be the most important determinant

nick: So we’re going back to witchcraft over here. 

Davis: Yeah, we are. That’s right. Witchcraft.

They [00:18:00] knew it.

right from.

nick: it. 

geo: But Right. But then, yeah, but then they’re not making money

Davis: So that’s why there’s not a lot of research that is done on, I mean, the amount of money spent on pharmaceutical research compared to research done on just proper nutrition, the pharmaceutical research just dwarfs the research that’s done on nutrition. And it’s because you can’t patent a diet, right?

You can’t say, this is mine. Nobody else is allowed to eat this kind of food the way I do.

joe: Yeah.

geo: So really, medicine of the future can just be some really simple, basic

nick: soup. Chicken noodle

joe: chicken

geo: Yeah. I mean,

joe: So I and Nick will, last night I watched Idiosyncrasy.

Is that, is it Idio with the, 

geo: what’s his name?

Mike 

joe: Judge.

nick: Oh, Idiocracy.

joe: Is it Idiocracy?

Is it Idiocracy? Okay. There it

geo: What did you say?

joe: Idiosyncrasy.

nick: Idiosyncra. I don’t know what that 

joe: I don’t know what that is. 

nick: That’s why I was like, 

Davis: not quite the same.

joe: Yeah. [00:19:00] Idiocracy.

nick: that is such a fantastic film. But

joe: this point about where they were using the Gatorade, on everything. And it was like the electrolytes,

nick: It’s what plants crave.

joe: so it’s that whole idea.

’cause they bought the CDC, the FDA, and then they made it like their company logo, and it was like, just everything. Yeah, so

nick: I love that film, but it scares me so much. Yeah.

joe: But that’s to this point here about industry and capitalism converging, and the good of that, and that gets to that ethics question, like, where is the line

geo: Right. And it gets back to the money thing. Yeah.

joe: So Yeah.

nick: Yeah. With the way they were doing stuff in that film, having everything so commercialized and very capitalistic was absolutely bonkers. Like the way that it does simulate what we are doing here in America [00:20:00] now. It’s,

all very like, okay, you can have this, but we’re gonna do this and then we’re gonna brand it this way.

geo: So it’s all about the spin.

joe: Yeah.

So

Davis: It’s also all about commodity, right? It’s like, what can you make off of somebody else’s problem or issue that they have?

joe: Right. Yeah.

I mean, we touched on pain and football in reality or sports, but in fiction, how’s that? I mean, how do you navigate that, square that circle, with your characters and what they’re dealing with,

Davis: So there is a certain amount of hand waving. ’cause I do injure my characters and I do want them healed and I do want them functional. But sometimes the healers don’t know what they’re dealing with. Had a patient, not a patient.

nick: In your 

Davis: I had a,

joe: Right. 

nick: patients. I get it.

Davis: I had a character who had [00:21:00] hyperthyroidism, and there’s a type, there’s different causes of hyperthyroidism.

This one was triggered by basically a cold.

joe: mm-hmm.

Davis: And so he was unable to train whatsoever because anytime he tried to exert himself, his heart would just race to, 180 beats a minute. And he had no stamina and nobody knew how to fix him because they didn’t understand what was wrong with him.

And the healing didn’t work ’cause they didn’t know what they were trying to heal. This is the handwaving part. One of my, I almost said patients, again. One of my characters recognized the symptoms, and she was able to heal him of what was going on with his thyroid. It actually was a fun little thing to write it that way.

And it’s also cool because we actually don’t have a cure for that particular issue. You have to just wait it out and you’ll get better on your own. But she healed him of it, so she sort of short-circuited [00:22:00] the process and got him better much quicker. So that’s part of what I do: I want to use diseases that are a little bit esoteric, but something that I know about and that the reader might not know about, and even the healers might not know about.

And it’s not just, whoosh, here’s the spell and everything’s cured. It’s whoosh. This is the spell that will fix this particular problem because we understand what the problem is.

nick: So are there any side effects for using a potion? Or a cure for something that isn’t the correct diagnosis or ailment.

Davis: I mean, I think that would depend on what the author or the creator wants to do. Like if they want to give you a penalty for doing something wrong,

joe: Right, 

Davis: then yeah, there should be a side effect. Or if they’re like, no, that’s just gonna slow down the plot, then there, it’s just gonna be an [00:23:00] effective, ineffective treatment.

joe: I was gonna say, like, maybe not in fiction as much, ’cause you’re right, you are limited by word count, what you’re gonna throw in there. But you know, in the video game land, and this is the nod to Nick here, who probably has played more video games than I have in the last decade or so.

nick: In the last week I’ve played more.

Yeah.

joe: Yeah. 

Davis: I think you probably have played more in the last day than I have in my entire life.

joe: But yeah, I mean, ’cause I think of older video games, like in the eighties, Wolfenstein, Doom, Descent, where you had a health bar, you’re going around, you’re collecting med kits and things like that. But video games now are multi-dimensional, character driven.

I mean, you could start introducing some of these things where, to your point, you’re penalized for taking the wrong potion or picking up a rusty needle you find off the ground and jabbing yourself. But you can also have long-term [00:24:00] diseases, maybe, that you are suffering from repeated injuries and the mental kind of strain of that.

And so you really need a different type of healing, all the way to, could you have more long-term diseases that are affecting you, your system? I mean, I don’t know. Is

nick: Joe, I’m gonna need you to cut this bit out so I can write it all down. 

geo: We cannot 

nick: let this go out.

Davis: I think that would be actually pretty cool. Like, if you’re injured in whatever way you’re injured. Like in the real world, if you have some sort of disease, like a urinary tract infection, and you were given the wrong antibiotic: one, you’re still gonna have a urinary tract infection, and two, you might end up having a secondary infection.

Not likely, but it’s possible. You’re certainly not gonna feel great, ’cause you might have a bad reaction to the drug that didn’t cure the initial disease. So I think that would actually be cool, if video game developers [00:25:00] or authors incorporated the mistakes of medicine into the healing so that there is a penalty. I think that would actually be pretty cool.

nick: I think that would be,

geo: of any examples

nick: I think that would be good in like a D&D session; like, Dungeons and Dragons would be a very easy start for that. Yeah. But as far as video games go, as of right now, I can’t think of any, because it’s very, this one thing is gonna help you.

geo: Unless you go

joe: to carry on through. I mean, a lot of these games are mission driven and you’re just checking off the boxes,

geo: and do you have so many lives that

nick: much anymore.

I was 

geo: say, is that even a thing?

nick: that was a thing more of the 

joe: the past. The past, I mean,

nick: so 

joe: when the games I played, you would, you got three 

geo: Right?

joe: There would be a turkey leg along the way that you would eat, or an apple, and that was your bonus for making it through the levels. Like Streets of Rage, I’m referring to, which is one of my favorites, the Sega Genesis Streets of Rage.

One, two and three. I spent a [00:26:00] lot of time on those. But yeah, you would have a random chicken on the road, and you just eat the whole thing, bones and all, and your health bar recovers. So yeah, you always have that. But when you mentioned UTIs, for whatever reason I thought of Grand Theft Auto.

Like that would be where you would

geo: like, would that be the name of the game? UTI

joe: Ut that 

nick: such a good game. UTI. What? 

joe: Hey mom, 

nick: can you get me UTI for Christmas? We

joe: the

UTI. 

Davis: Yeah. Urinary tract infection. Those are always fun.

joe: Yeah. No, I just think of it because it was our oldest son, and he wanted to play Grand Theft Auto with his friends, but we were like, I don’t think you’re old enough to play that game. And so he said, no, let’s play it. I’m gonna set it up in the living room. I’m gonna play it. You’ll see it’s not what you think.

And I was like, oh, I think it’s everything. I

geo: friends to join in. He

joe: got his friends to join in. But to start the game, you have to like perform a drug [00:27:00] deal. And so he’s trying to do it. Yeah.

geo: or you have to, yeah, you have to rob someone. 

joe: He’s

trying to do it like 

geo: and he goes, look, I can get a job and I can like earn points and stuff.

I’m like, 

joe: are sitting here for about an hour and we’re like, just please just rob the bank. And so he robs the bank and the game starts in earnest and then he is there and he starts, oh, I’m gonna get a job. And I’m working at a store. He’s on his all this on, he is got a, and then just like, Porsche or something pulls up and they’re like, Hey Max, we’re glad you’re in the game.

And then and then they’re like, he is like, yeah, he is like, oh yeah, I’m going to my job. And they’re like, no, I got this. I beat up this old man. We took his condo and da. And so we’re like yeah, I don’t know about this Max. So it was

nick: I got my nine to five to go to, guys. I can’t, and I’m tired after doing it. So

joe: I just, I thought the real world isn’t quite as fun of a game, but having these kinds of disease modalities, that would be that type of interface where you’re [00:28:00] there.

Right

geo: Aren’t you playing the game to escape all that?

joe: Yes,

nick: There are some people that are not like the amount of role playing that goes into those games.

I know an ambulance driver who plays as a policeman in the game. He goes, yeah, I go in and I’m just breaking up fights in the game. And I’m like, why? Yeah, why are you,

joe: are you

geo: Because maybe it’s all the stuff he wishes he could do in his regular job.

joe: Is it Nick Cage? Yes. Yeah.

nick: I’m not even gonna attempt to done

joe: He was an ambulance driver, wasn’t he? Yeah. In, was it?

Yeah. What movie was that? Bringing

the Dead? Waking the Dead, or one of those?

Davis: we, are we talking about Nicholas Cage again or

joe: Yep. Yep.

Davis: There was a, one of my, so did you go to the, oh, what is it called, at Dragon Con? They have this big parade with all these people.

Cosplaying. Did you go to that, 

joe: I saw a part of it. I had a panel to get to. Like, while it’s like

Davis: [00:29:00] so I, I’ve seen it twice, but the funniest thing I ever saw was this group of people dressed up as Nicholas Cage from all his different movies,

and they just had like this little cardboard face on.

geo: Oh my God, that would be so amazing. 

joe: would be fun.

Davis: It was hilarious.

joe: that would be fun.

geo: would be so I missed that. No 

joe: I saw a part of the parade. I had to make my way through it. ’cause I was like trying to, get to a panel. But yeah,

Davis: Yeah. It’s a lot of fun.

joe: is, Dragoncon is a it was a ton of fun. So

geo: maybe we’ll get to go sometime

nick: I know we weren’t invited.

joe: You were invited.

Everyone’s invited. I wasn’t.

Davis: You should go. The thing that just hit me is it’s not really anti-aging that you guys are doing anymore, but it’s more of a life extending,

Yes.

like, yeah, it’s not just life extending. What you’re trying, what I wanna see, is

quality of life. Yep. Exactly.

So it’s not just you live more years, but in the years you have, you can [00:30:00] still do the things you wanna

geo: That’s huge. Huge. If you want,

nick: a pain or suffering

geo: now. That’s huge.

Davis: right, or limited because of constantly having to go to the doctor, or your health just doesn’t allow you to go up a flight of stairs or go for a walk with your grandkids.

That’s what you want. You want those kind of qualities where you can go with your grandkids to the Grand Canyon if you have the money and show them the Grand Canyon and you’re not bogged down 

nick: Not

just showing it to them on the tv. Man, look at 

Davis: Yeah, there you 

nick: Grand Canyon right here, ain’t it? A Butte.

Davis: really cool in real life too.

joe: And I was gonna say some of the, that, the other thing about medicine and is that’s changing a little bit is disease prevention and that goes to this quality of life.

Like to actually not wait until you’re actually in disease state. 

geo: Goes back to the thing about nutrition and how you’re just living your daily life and how much that is a matter of prevention

joe: and [00:31:00] working out.

I think we underestimate our activity levels and we get very sedentary in our daily lives, and I think moving and being active.

Davis: Yeah. I mean, our bodies were meant to be used not to sit at a desk all day. I think you’re absolutely right about that. We’re healthier when we’re moving.

joe: And we talked about on other episodes where, we’ve evolved to be, these kind of long distance creatures that can stalk prey.

I think in the heart of the superhero episode, si he was talking about that. And in the performance episode we had, we talked about those things, about how just the human body, we’ve evolved to go long distances and to endure through that,

nick: And now we’re just hunting deals on 

joe: hunting deals. Yep. Getting our thumbs a workout.

that was 

nick: a hard time

Davis: Or looking for the closest Chick-fil-A.

joe: That’s

nick: man. I gotta go on a scavenger hunt to the [00:32:00] grocery store.

geo: I always think of Wall-E 

joe: yeah. 

geo: I always think that, like, that being the future.

nick: Is that your movie of the season?

geo: Yeah. I just feel like, I think that’s exactly what’ll happen if we just continue to just look at our phones, our little screens, and just ride around on little motorized vehicles.

You know what I

nick: Do you have one yet? No, they’re coming in the mail.

joe: A motorized 

nick: Yeah. The little 

joe: A scooter. Oh,

Davis: A little hovercraft. Yeah,

joe: Yeah.

With a 

Davis: go play golf. 

joe: can

watch.

That’s right. Yeah. 

nick: I mean, you could still bowl and it won’t make a difference.

geo: and you’re totally distracted just looking at your little screen and you don’t even know at all what’s happening.

joe: I think we talked about characters in games, and all the attention, or the protagonists, the attention’s usually focused on them. But I think in real life we are moving towards personalized medicine, and

Davis: That’s the goal.

joe: that might be,

Davis: it’s.

joe: Yeah.

Davis: Whether it’s cancer therapy or [00:33:00] anything, you’re looking for medications that will target that particular cancer with particular receptors and no other receptor sites, so that there’s not a bunch of side effects that you have to deal with. And we’re getting much better at that every single year, which I think is like one of the best things. So when I was in training, the whole thing about chemotherapy with cancer, it’s like, what’s worse? The chemotherapy or the cancer, you know?

And now when I see some of my patients, they go to the oncologist and they’re on their chemotherapy agent. Those side effects are still present, but nothing like they used to be. It’s just been wonderful to see.

joe: Yeah. I’m even thinking of pushing beyond that, where you were talking about some of the GLP-1s and how, per person, one can have very intense nausea and some have very little. And so this idea of personalized medicine, really personalized to each individual,

the 

geo: a different formula.

[00:34:00] Exactly. 

joe: it would be tailored to their genetics, their cellular makeup.

So that when you give it to them, the dosage, everything is so finely tuned that it does that. And that’s part of this longevity

geo: And I feel 

joe: of life. These things are starting to come where

it’s gonna be. help 

geo: but be cynical and think, going back to like the dollar

joe: Yeah. That’s,

geo: Who’s gonna be able to afford that kind of,

joe: Unfortunately,

nick: I

feel like that wouldn’t be that bad though. ’cause if they, I know we’re gonna bring it up, but AI if that goes around and is supposed to be personalized to your, like genetic makeup, they can have these, alright, this is what the formula is, what is gonna be the perfect ratio for this body.

joe: I think people are working on it, but as well, to Georgia’s point, and Davis, you can jump in, I think some of the drive is [00:35:00] about money.

So to entice research, especially as government funding is now amorphous and weird, we’re in some weird state, that means the private sector has to pick up the tab, and they’re only gonna pick up the tab if there is profit, right? Because we’re all raised good American capitalists.

And so that’s the way it works. And so, to your cynicism, I think it’s well-founded that the driver of the technology is going to be the people who can pay for that technology, and a pharmaceutical company sees a buck in it, and then after they make their money back, then they will

start to lower the price, or as technologies develop, things that might be difficult and expensive come down. Sequencing the genome: the first one was really expensive, took a long time, and now you can take a cheek swab and mail it in and get genetic information back. So that’s what happens.

But you need someone to [00:36:00] say, I can make money in this industry, and then they do it. And that’s the way a lot of progress unfortunately has propagated: can I make a buck off this? And if I can, then let’s throw two bucks at it if I’m gonna get three bucks back, if I’m gonna get four.

So it is 

Davis: I mean, that is how pharmaceutical research occurs. It’s driven by profit. There’s nothing necessarily wrong with that, but you don’t love it when you know that somebody can benefit from something and they can’t afford it ’cause the drug costs too much. And I don’t think that’s because of the pharmaceutical industry.

That’s actually because there’s three layers of bureaucracy between where the drug is made and the pharmacy where you pick it up. And all of those bureaucrats, or those companies that, really, I’ll stop there

nick: I feel like this is our most anti-capitalism episode ever.

Davis: So I’m not against pharmaceutical research or capitalism, I just, I feel [00:37:00] for my patients that

can’t afford drugs that I know they could benefit from. And I know where the problems are. A drug company actually provides benefit. They’re developing stuff that can help

geo: right. 

Davis: but there’s other players that get their fingers in the pot that I don’t think should be there.

joe: yeah. And to your point, Nick and Willie, but AI, I think at kind of the cutting edge of technology, especially drug discovery, we’re starting now to see a lot of AI look at protein structure, try to,

Davis: That’s where I think the personalization is gonna come from, and that’s my hope for why the limitation on cost, or affordability I should say, won’t be as much of an issue. Because the thing I would hope the AI can do, as far as pattern recognition eventually, if it’s as good as we are at pattern recognition, is sweep through [00:38:00] receptor sites.

’cause receptor sites are 3D hodgepodges that are really amorphous, and it’s really difficult to understand what’s supposed to fit in that thing. It’s hard to visualize, hard to model. And if you have a program that can model it, it can look for that particular receptor within somebody’s cell, wherever it happens to be looking. If that cost comes down, which we hope it does, since it always seems to come down with technological innovation, then at that point you would have compounded medicine that’s specific for that individual, which would hopefully be far less expensive than some pharmaceutical company producing a huge amount of the same drug where they have to make a

joe: I, each vial.

So OpenAI is the one that does protein structure. And as a structural electron microscopist, this is something where [00:39:00] you talk to pharma folks, and the AI models are nice, but they still have to go to the experimental. I think we’re still building all the experimental data up, and you start seeing errors in the AI models.

And so it’s really now this handoff: we get a model that’s close, go to experimental research, refine it, actually figure out the right structure, the ligand binding sites, and then go to AI and go, what can fit, what actually makes sense to go into this ligand binding site? Make a hundred, go through pharmacology and just make wild stuff.

And then we can make that synthetically to fit in that pocket. And so I think then you build that up. Then you can look at what are the differences in your protein structure and Nick’s protein, Davis’, mine, and then go, okay, these are the same, but your pocket’s a little bit different. And this, like, this key doesn’t fit in [00:40:00] this lock as well.

We need a different lock. And I think that’s what you’re getting at. Now we can find the lock, the key to the lock, a lot faster if we can get to that protein structure, ’cause that now has become faster to get to. And now can we find keys to the locks faster?

geo: And a totally unscientific thing popped into my head when you’re talking about this, but it makes me think of like, like publishers going to like print on demand versus just printing like thousands and thousands of copies and then we’ll see how many, but no, this specific person wants this specialized personal thing, you know what I mean?

So you’re only gonna print it when they demand it. So like that with medicine.

joe: No, I, yeah, sure. I know.

that works. No, I, yeah. Anything. But I mean, the thing that you fear is that personalization, because it’s unique to you, will cost more.

And it’s like, how do we bring down the cost [00:41:00] of this 

geo: But you’re only printing it when you need it. But do you see

Davis: On a per-unit basis, print on demand is gonna be a lot more expensive than if you do a large print run.

geo: oh, 

Davis: for instance, if you do a print run of like 5,000 books, each unit might cost $3 to print.

Print on demand, each unit will cost, if it’s trade paperback, depending on how long it is, anywhere from eight to $12.

Yeah, I know it’s,

geo: so it’s the total opposite of what I just said. Okay.

joe: Yeah. Personalization usually drives, yeah. Because you’re making something that’s 

Davis: It’s like if you hired somebody to build you a car versus going to a dealer and just buying one of their cars.

geo: Yeah. Anytime you customize something,

joe: Yeah. And then you have all the people that were involved, so that all adds a tax on to that final

Yeah. Your final cost becomes, yeah, just higher. Yep.

Davis: But I mean, those are different sorts of [00:42:00] examples, I think, because there’s a lot of material cost in printing a book or making a car. There’s just a lot of material in that, whereas there might not be quite as much material cost to create that medicine that one particular person needs.

I don’t think we’ll have a handle on that for a long time,

joe: yeah 

Davis: as to what it actually is.

joe: A lot of the costs might be the first one, because if you know the drug works in this pocket receptor, it will do

what it 

geo: you figure out 

joe: then

making derivatives of that, tailored to different people. That process of going through all the checks and balances should be faster and easier for a drug company; making product derivative B is cheaper than making the alpha.

geo: Okay, that makes sense. Yep.

joe: so that’s where you could that’s how you save

Davis: that’s the hope 

joe: and getting that so you don’t need to do as much chemistry. You don’t have to keep trying to reiterate and reinvent the [00:43:00] wheel every time. You can just jump to making, 

geo: got

joe: A different color wheel.

nick: So this is what AI should be used for.

joe: Yes. Among other things, I mean, doing taxes or something like that.

I can, 

Davis: Joe, you’re a writer, aren’t

you? 

joe: I do. Yep.

Davis: There’s an AI program that I’m using where you can upload your PDFs into it, and then it sort of uses that as a database where you can interrogate your previous work, so that you don’t have, I have trouble remembering my kids’ names sometimes, and so

I’ll call them the wrong names all the time, or I’ll call my cat, my, my dog’s name and vice versa.

So I’ve found that to be extremely helpful, since the series that has the fewest named characters is at 155.

So 

geo: Wow. 

Wow. 

Davis: there’s 155 named characters. I’m [00:44:00] never gonna remember all that. The most I think is 280 in 

one 

joe: I always, I mean, to touch on this, it has a little bit to do with this episode, but I always say I think there’s different things, and we always gotta separate that ethical, moral bit out of a lot of this, even medicine. But the AI creation of the large language models and how that data was curated.

There’s a great argument to be made. I’m glad to see some compensation starting to happen with the Anthropic case that was just settled. And so I think that’s one discussion. But like a lot of these technologies, we’re not putting the genie back in Pandora’s box.

It’s out there. It’s gone. It’s loose. And so I think, to your point, Davis, it’s how to use the tools

to

best aid us in our craft. And that’d be medicine, that’d be science and not a replacement for the things that we enjoy doing, right? We enjoy writing for writing’s sake. [00:45:00] AI really is bad at that, so don’t use it for that at all.

But for something like this, where you’re curating your characters and saying, I can’t remember what character interacted with what character, and you’ve fed it your own personal data, you’re controlling what you’re feeding it, controlling that chat box.

I think that’s the perfect example of how these things, these tools, should be used. To Nick’s point, this is how you should use AI.

Davis: where it’s a wonderful feature and a wonderful aid to the creative process rather than a replacement of the creative

joe: Yeah.

geo: Yeah. So Davis when did you start writing your novels?

Davis: Oh gosh, I’m old. So I’ve wanted to be a writer probably since 1985, when I was much younger than I am now. But

things got in the 

joe: older than some of us in 

geo: retreat. I then notice how he looked at me.

Davis: Y’all look a lot younger than I do.

joe: I, I, yeah 

geo: [00:46:00] no.

joe: we can talk off offline, but Yeah. I think we’re, it’s surprising sometimes, so Yeah.

Davis: so I, things got in the way, my career, going to college, med school, that sort of thing. And so I didn’t really, I tried my hand when I was a teenager. Really started trying when I was about 38 years old, and then published for the first time in 2014. I’ve been a published author for 11 years.

nick: Congratulations, man. 

joe: Yeah.

Davis: It’s a long journey.

geo: and always fantasy. Always.

Davis: Yeah, it’s always been fantasy, because science fiction and fantasy have always been my first loves. I would love to write science fiction, but I’m caught up in a couple of long series, so I don’t know when I’ll be able to do that. But I would love to write space opera or first colony.

I would love to write a first colony story. And I just don’t know if I’ll ever have the time to do those kinds [00:47:00] of things. But those are my bread and butter. I’ve been reading a lot of Freida McFadden lately, though. I don’t know if you know who she is, but she’s very popular. Everybody in my office absolutely adores her, and they’re about to come out with the movie, I think with Sydney Sweeney. The Housemaid,

I think, is the name of the book and the movie. But yeah, I’ve been reading a lot of hers, and now I’m kinda like, gosh, it would be fun to write a thriller too.

joe: Yeah.

geo: Right, yeah. 

Davis: And she’s actually also a physician. I think she’s a neurologist by training.

geo: Oh, really? Yeah.

Yeah.

joe: Yeah.

yeah. A lot of,

Davis: So go Freida.

geo: That’s awesome. 

joe: it is. I’ll put that in the show notes.

nick: So what’s been your biggest inspiration for your books?

Davis: So I wanted to be a writer when I read The Lord of the Rings. That was my biggest inspiration for wanting to be a writer, because I wanted to create something that epic. And some books by Arthur C. Clarke and also Robert Heinlein were what [00:48:00] spurred me to want to write sci-fi, especially Heinlein’s young explorer series of books, like The Moon is a Harsh Mistress.

joe: good one.

geo: Mm-hmm.

Davis: And so those would probably be the biggest inspirations for why I wanted to be a writer. In terms of the books I’ve always wanted to pay homage to, it would be The Wheel of Time.

Um, I absolutely loved Robert Jordan’s books, even the slow ones that everyone complains about. Middle-book syndrome, I guess you could say.

But those were the books that I always gravitated towards those big epics where something world shaking is about to happen.

joe: Yeah, and it’s funny, you mentioned The Moon is a Harsh Mistress. That one had the AI component that helped the moon colonies fight for independence, all through telephone lines, I think.

It’s been a while since I read it, but I still remember that. It was one of those early depictions of a sentient computer: the [00:49:00] computer had self-agency, was intelligent, and then made a decision about who it thought was right and wrong.

No, it was a great read. Cool.

geo: And just a question more on the creative process: how do you find the time? Because it sounds like you have quite a few books and series, and then also such a, what’s the word? Your job takes so much. How do you find the time to write?

Davis: It was easier a few years ago, before COVID, when the stress of just being in healthcare wasn’t quite as much. I’d have plenty of emotional energy when I got home; I’d be able to spend time with the family and my kids, and I’d have a few hours of writing I could do. I didn’t read much and I didn’t watch much TV at that time. And that’s really how I found the time to be a writer: I sort of had to [00:50:00] sacrifice entertainment, because I wanted to write the books that I wanted to read.

And that’s how I found the time. It’s been more challenging recently to get that balance, just to be emotionally ready to write at the end of the day compared to how it used to be. I’m hoping that’ll change. It’s one of those physician, heal thyself things: stop stressing so much.

nick: Then

joe: In your writing, I think you touched on it, but between real medicine and fantasy medicine, how do you strike that balance in your own works?

Davis: I want injuries to feel real in terms of recovery. Even if the physical healing is shortened, I want the mental healing to occur and not just be hand-waved away. That’s one of the things I draw from as a physician: you have to recover, you have to [00:51:00] grit your teeth and fight through physical therapy.

If you have a knee replacement, that’s the beginning of your journey, not the end. The surgery is the surgery, but afterwards comes the physical therapy, and that’s where the work happens. So when my characters get injured, I want them healed, but I want them to work for recovery. That’s part of how I think about things. And that’s not to say you can’t just hand-wave it and say the person’s healed and off they go to do whatever they need to do, because it all depends on what kind of story you’re trying to tell too, right? I mean, in those video games where you just have to eat a loaf of bread and you’re good to go, the game developers don’t want you to slow down to heal.

They want you to be able to eat that loaf of bread and then get right back into it. That’s the journey they want the player to involve themselves in, [00:52:00] rather than what we were talking about earlier. Like, that was the wrong loaf of bread, it didn’t do anything. That was,

nick: that just made me 

Davis: had a protein, A protein meal.

joe: You’ve lost a

Davis: That’s not the journey they want you to take. It just depends on what you want. But that’s what I want.

joe: Very good.

Awesome. We’re coming to the end here. Nick, you got something? It looks like you’re thinking.

nick: I’m trying to piece together how to put it. What is something you think everyone should do to be healthier? You mentioned that it starts in the kitchen.

What is one thing where you’re like, oh yeah, this is something I would recommend people do to help with this?

Davis: Well 

nick: I’m looking for free medical advice, is what this is.

joe: how do people live forever is what he is asking though.

Davis: That is a good question. I did mention there’s not as much research done on nutrition as there is on pharmaceuticals. One of [00:53:00] the things that’s challenging is that there are all these diets that have come along, whether it’s the Mediterranean diet, the keto diet, a high-protein diet, or the Atkins diet, whatever you want. What’s interesting, and Joe was talking about this before, is that the same diet is not necessarily applicable to every single person in terms of their health. For me, for instance, a low-carb diet does nothing for my glycemic control. It doesn’t help. High protein, moderate carbs, lots of exercise does wonders for me.

So giving nutritional advice isn’t really something I can do, because that’s something you almost have to discover for yourself. But there are some basics. Processed food is not good food. If you go to the grocery store, stay along the edges, not in the aisles where the food is processed and [00:54:00] in a box.

So fresh food is your friend, not the food in a box. You should get the majority of your food along the edges of the grocery store. In terms of exercise, it depends on what you’re trying to do, but a half hour of walking every day is a wonderful thing.

If nothing else, it’s almost meditation, and it’s always good to have that space to let the stress of the day empty out of you if you can. If you want to maintain muscle mass as you’re getting older, weight training is fantastic if it’s done safely. You don’t need to try to lift like Arnold or Ronnie Coleman or something; you’d actually probably hurt yourself if you did. So moderate weight training would also be good. But if you can’t do any of those, shop on the perimeter of the grocery store and eat fresh foods.

nick: Oh yeah,

joe: I [00:55:00] got a lot out of that, and it tethers to something really interesting from earlier: I talked about genetic sequencing, and some of that’s now tethering into nutrition and your own genetic background. As that becomes a bigger thing, that information can guide your nutrition: what should your diet be, how should you eat, things like that. So there’s really a lot of convergence of these ideas. But

geo: Yeah, and this, I’m trying to pull this up out of my mind, this is coming back to memory issues and that type of aging, but there’s this guy who spent like $2 million trying to figure out how he can live much, much longer. And it was interesting, some of the things he was doing. One of them was that he was taking his sleep so seriously, like he was gamifying it, with all these different techniques to get better sleep.

And [00:56:00] that’s huge. I mean, that’s just another part of it.

joe: And there’s also a genetic factor in that. They just found that there are people who genetically only need four or five hours of sleep, that they’re coded that way, and there are other people at the extreme other end who need eight or nine hours of sleep.

So once again, it’s like nutrition: people give the average, but really you have to listen to your body and go with it. Even with disease states and things like that, body wisdom is a thing I don’t think people give enough credit to: you know what your body is doing, how it feels, when to speak up and advocate for yourself.

But yeah, what I didn’t mention is the oldest piece of fiction with healers in it. Nick, you wanna take a guess?

nick: I’m going to say it goes back to one of the first writings, scribbled on a cave wall, where they’re like, ooh, eat this, it makes you feel better, right?

joe: It goes back to [00:57:00] the Epic of Gilgamesh.

nick: I’m gonna say that you’re wrong. It’s gonna be cave drawings.

joe: Yeah, no, that one has healers in it.

Davis: What was it in Gilgamesh that they talked about? I don’t remember.

joe: It was the link between plants, herbalists, and healing, that relationship. But yeah, a lot of that’s in there.

geo: And what was the year of that?

joe: Around 2000 BCE, yeah. Yep.

geo: It’s been a while. Is that one you can get print on demand?

joe: Print?

Davis: The oldest is supposed to be a hieroglyph. One of my attendings, he was from Egypt, was telling me that there was a hieroglyph from one of the pharaohs: one of the sons of the Pharaoh died, and the inscription said he lost his life through his urine. What my attending thought of, since he’s an endocrinologist, was that the son had type one diabetes. With type one diabetes, your blood sugars are really

joe: high. Yep.

Davis: You’re constantly thirsty and you’re urinating [00:58:00] all the time, and you waste away through your urine. So he thought that was the first documented case of type one diabetes.

nick: That’s hilarious. I would not have thought of that.

joe: Yeah, Egyptian mythology around 1500 BCE also talked about healers and healer goddesses and things like that.

nick: I’m pretty sure mine was right though. Cave drawings.

joe: Oh, I don’t know. I don’t think so.

nick: We’ll put it in the show notes.

joe: I said fictional writing, and that’s a drawing, so, yeah.

nick: yeah.

joe: Yeah. Yeah. Potato, potahto.

Cool. All right. Davis, you wanna tell folks about your books? I know we had a little segment here at the end about your writing process, but tell us where we can find you online.

Davis: Sure.

So I’ve got a set of books that are all interconnected called the Anchored Worlds. There are four series. They’re actually separate series; you don’t have to read any of the other series to appreciate the one you’re on. But they’re all interconnected, [00:59:00] kinda like, I guess, Brandon Sanderson’s Cosmere.

The books are available on Amazon, and most of them are also available on other stores like Kobo, Barnes & Noble, Google Play, and Apple. The audiobooks are available on Audible. I got Nick Podehl as the narrator for most of my books, and Travis Baldree for three of my books.

So I got great narrators.

nick: And we can get those personalized, right?

Davis: Yeah. Just send them to me.

I’ll send ’em back.

nick: Thank you so much for being with us.

joe: And you said you had a new book coming out in the summer, was that right?

Davis: Yeah, I have a series called Instrument of Omens. Four books are out right now, I’m working on book five, and I promise it’s gonna come out.

joe: Cool. Let us know when it does and we’ll send that out to everyone to grab it. 

Davis: Absolutely. 

joe: Davis, thank you for joining us on the show.

Davis: Thank you for having me. This was awesome.

nick: Sorry about our earlier…

Davis: I appreciate you having me.

joe: Yeah.

Oh, [01:00:00] hopefully we didn’t get you in any trouble with the insurance companies. I work in a medical institution, so I don’t know, maybe it’s me. I’m sure

geo: you’ll be in trouble.

Davis: to say, but that’s okay.

joe: Awesome. So yeah. You got me, Joe.

nick: you got Nick,

geo: Georgia,

joe: We’ve got Georgia.

nick: and we went down some holes.

joe: Stay curious. Bye. Stay safe. Eat healthy, love y’all.

geo: Get sleep.