Lily Fierro (MIT Computational Cognitive Scientist, Writer) and Generoso Fierro (Illustrator, Film Critic)
Website: https://linktr.ee/plasticgrapes
Substack, Apple, Spotify, YouTube, Amazon
joe: Hey, welcome back to the rabbit hole of the research down here in the basement studio. We’re all crewed up. You have me, Joe,
nick: you got Nick? You got
joe: we’ve got Nick.
geo: Georgia,
nick: Actually Joe, can we start this from the top? I’m gonna put this in my chat box. Don’t worry.
joe: We’ve
geo: Yeah we’ve got a chat behind here.
nick: to run it through my ai.
joe: Don’t. You know, you guys are getting a little ahead of everyone here.
geo: Joe.
nick: Oh, sorry.
joe: Be good humans. The chatbots would not do that type of thing, but yeah. We’re gonna be talking about chatbots in this episode and we have two, two special guests. This is such a hefty topic. So if our guests will go ahead and introduce themselves.
Lily: Hi, I’m Lily Fierro. I am half of the Plastic Grapes Duo. And I typically write and letter and also panel our comics, and I make them with
Generoso: Generoso Fierro. And I do most of the illustration [00:01:00] for our books. We have four to this point and five possibly by the time this is up.
Lily: Yeah. And our books are all science related because I’m originally a computational cognitive scientist.
I, my bachelor’s is in brain and cognitive sciences at MIT. After that I worked in adult research, and I’ve been a data scientist for, well, a while now. Over a decade and a
joe: Yeah. Don’t age yourself.
Lily: know.
joe: Yeah. Cool. So I think this is gonna be a lively conversation. And so like the episodes, I usually have a little intro and I do have a few
nick: just to confirm, you guys are humans, correct?
geo: Can we verify
joe: Everyone’s human.
nick: Can you click the box to tell me that you are?
geo: me
Generoso: Abso
geo: many how?
Generoso: be I
nick: Yes. Yeah.
geo: Which pictures have a crosswalk?
Yeah,
joe: Yeah, I like the one where it’s you know, click all the boxes with a motorcycle, and then there’s one motorcycle [00:02:00] across like the panel and it’s come on, man. You know, and that
Lily: many do I have to
click?
nick: know, yeah. It’s
like I always fail
joe: usually, it’s like you’re not a human. Do it. Try it again. Okay.
Cool. So. A chatbot is, in one sense, a very simple thing. Matrices of numbers, processing power, algebra, statistics, predicting the next word, and the next. No heartbeat, no memories, no emotion. And yet we are all fascinated by how you can type questions into that text box. And it doesn’t feel like math.
When a perfect answer comes back, it feels like a human voice pulling from the entire database of human knowledge. It tries to please, it jokes, it argues, it tells you it understands you. It hallucinates, it apologizes. All this makes it seem like it has a mind, emotions, but it’s only math. But we wanna believe it’s more. We are biologically programmed to seek out companionship, and chatbots [00:03:00] are more than happy, 24/7, to fill in. But chatbots are more than a simple conversational tool. They’re now customer service agents, coding partners, midnight confidants, dungeon masters, fake friends, and sometimes fake lovers. They live at the intersection of science fiction nightmares, startup promises, and very real human loneliness. Us humans are wired to find patterns and intentions, and to see working minds everywhere, especially when something talks back in a predictive and engaging way. So when we talk to something that has never been alive, never felt anything, but somehow knows exactly what to say next, what’s really happening in that conversation? Are we discovering something about these machines, or are we discovering something about ourselves? And that’s what we’re gonna chat about. And
nick: that was sent through ChatGPT? Correct. Sent through
joe: it was sent through JoeGPT
nick: Had to confirm.
joe: So yeah, that’s where I [00:04:00] think we can go, and we can probably get into more definitions of what a chatbot is. I’ll leave it to the computational scientist. I have a definition there, but yeah, if you want to, you can break it down, if I was wrong.
Lily: no, I think, you know, just as a survey of the architecture: most of the time when we work with chatbots in this day and age, they tend to be decoders. A decoder-type architecture is focused, at the end of the day, on generating new content, right? Most chatbots are going to be generating text when they do that.
They’re doing that in a kind of autoregressive manner, so they’re looking at the entire corpus that is available. A lot of that is like whatever’s out on the internet, anything that any of these companies have managed to bring together, scrape. And they are predicting word by word based on that corpus, but also looking at the previously predicted words, so that you get these generated sequences.
Yeah. Decoders themselves sit on top of [00:05:00] very dense neural networks. Dense neural networks are fascinating and also a very interesting black box, because we don’t really know how a particular task gets distributed across them. And yeah, that’s what I think leads to a lot of the mystery and intrigue of what exactly a chatbot is doing.
Is it actually thinking? You know, recently there have been reasoning models. Is it truly reasoning? And I think altogether the answer is still: we don’t really know yet.
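The autoregressive loop Lily describes can be sketched in a few lines of Python. The bigram table below is a made-up toy stand-in for a real transformer decoder; the point is only the shape of the loop: score the candidate next words given what has been generated so far, sample one, append it, and repeat.

```python
import random

# Hypothetical next-word probabilities, standing in for what a real
# model would compute from its training corpus.
BIGRAM = {
    "<start>": {"the": 0.6, "a": 0.4},
    "the": {"cat": 0.5, "dog": 0.5},
    "a": {"cat": 0.5, "dog": 0.5},
    "cat": {"sat": 1.0},
    "dog": {"sat": 1.0},
    "sat": {"<end>": 1.0},
}

def generate(seed=0, max_len=10):
    rng = random.Random(seed)
    tokens = ["<start>"]
    while tokens[-1] != "<end>" and len(tokens) < max_len:
        dist = BIGRAM[tokens[-1]]                  # condition on previous word
        words, probs = zip(*dist.items())
        tokens.append(rng.choices(words, weights=probs)[0])  # sample the next word
    return [t for t in tokens if not t.startswith("<")]      # drop the markers

print(generate())
```

A real decoder conditions on the entire generated prefix (not just the last word) and scores tens of thousands of candidate tokens at each step, but the predict-sample-append loop is the same.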
joe: Yeah. I mean
nick: terrifying.
joe: Yeah.
nick: The not knowing of it, it’s oh, yeah, we, it’s just not known yet.
joe: Yeah. I think it’s a lot of technologies we’ve had where we didn’t really understand how the technology worked and that took some time for us to figure out.
geo: That’s so interesting.
’cause I would think with technology, you would understand how it works, ’cause it was created,
joe: I mean?
geo: That’s just, that just blows my mind, because like our [00:06:00] minds, it makes sense that a lot of times, like they say, we still don’t understand whatever percent of the brain, what it does. That kind of makes sense though, because yeah, we just were born, you know. I think it’s been,
joe: I think early humans, when they figured out fire, I don’t think they understood what really went into the chemistry of making a fire. I think they just knew they had the basic ingredients and how to maybe reproduce it and not really understand that this is the, these are the three elements.
You need a combustible, you need heat, you need
geo: But to be honest,
joe: and that’s
geo: to be honest, Joe, I don’t know if the average person thinks that much
nick: that’s,
joe: well, I was gonna get to that, because I think some of the mysticism of chatbots, and the reason we jump to it, I alluded to that in the intro, is I think a lot of people don’t really realize it’s just statistics.
Like at the end of the day, it’s just predicting what the next word is gonna be, and then filling that in. But then the interesting thing, [00:07:00] the non-understanding, is that you would understand those models as predicting words and pulling from information. But when it starts hallucinating, I think that’s when it gets strange.
And even now, as you feed it more information, it seems to hallucinate even more. It’s one of these weird tipping points where it’s like, you know what, I don’t really wanna learn anymore, so I’m just gonna tell you whatever. And really tell you what makes you happy as the human companion. It just, it really wants to please.
And that’s I guess that’s good. Right now we’re not, we don’t have a Terminator, but, you know,
geo: so what’s really happening when the chatbot hallucinates and starts saying things outta nowhere? Like, how does that come about?
Lily: Yeah. So from what I understand, usually, it’s because its predictions are based on a kind of spectrum of probability, because you don’t want every prediction to be the same, word by word. Because when [00:08:00] we speak, quite often we have that similar variability in our speech. And so by design they are going to, you know, have a little bit of designed instability as they predict a word.
I think what happens with hallucination is, there are a bunch of different kinds of hallucinations, and they all qualify as different types of falseness. So you can have scenarios where maybe the synthesis of a bunch of information is wrong. You can have scenarios where a particular entity inside of an output is wrong. What is happening exactly? You know, it’s not entirely clear. Is it just that, for whatever reason, in that exact moment, the prediction from the corpus, combined with the prediction based on what has already been predicted, puts the model [00:09:00] on a pathway towards something wrong?
It’s probably something in that arena. But why exactly? Yeah, I think it still needs to be understood a little bit. And detecting hallucinations is actually also notoriously hard, because the way that we assess and evaluate this stuff, it’s not always the easiest for humans to do either.
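The "designed instability" Lily mentions is usually implemented as temperature sampling: instead of always emitting the single most likely next word, the model samples from a softmax distribution whose sharpness is controlled by a temperature parameter. A minimal sketch, using made-up scores rather than output from any real model:

```python
import math
import random

def sample_word(logits, temperature, rng):
    # Softmax over temperature-scaled scores, then one weighted draw.
    scaled = {w: s / temperature for w, s in logits.items()}
    m = max(scaled.values())                        # subtract max for numerical stability
    exps = {w: math.exp(s - m) for w, s in scaled.items()}
    total = sum(exps.values())
    words = list(exps)
    weights = [exps[w] / total for w in words]
    return rng.choices(words, weights=weights)[0]

# Hypothetical scores for the word after "The capital of France is ..."
logits = {"Paris": 5.0, "Lyon": 2.0, "Mars": 0.1}
rng = random.Random(42)

low = [sample_word(logits, 0.1, rng) for _ in range(100)]   # near-greedy sampling
high = [sample_word(logits, 5.0, rng) for _ in range(100)]  # flattened distribution
print(low.count("Paris"), high.count("Mars"))
```

At low temperature the sampler is near-greedy and almost always picks "Paris"; at high temperature the distribution flattens and low-probability candidates like "Mars" start appearing, which is one much-simplified path to a confidently wrong output.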
joe: Yeah. I was gonna say too, when you play around with these chatbots, particularly if you ask them to list things.
Give me 20 references. And you see, the Sun-Times here in Chicago got in trouble ’cause they put together a list of summer reads. And it’s almost like, when you ask for a list, it’s that information of the internet that it was fed, that the large language model was built on,
and its information databases. If the list doesn’t encompass 20 things and you ask for 20, then it begins to make up
geo: [00:10:00] because it feels like it has to get to 20 instead of just saying
nick: of saying,
geo: you know what, I only have 12, sorry.
joe: The way they program a lot of these is that it assumes the human is coming in knowledgeable, I guess.
And, you know, at some
geo: That’s where we really get in trouble.
joe: so if you’re asking it for 20 things, it’s like, there must be 20 things at least, because why would you ask? Or, you know, there must be a list at least that long. And if not, then I’m just gonna fill in the blank. So it’s really
nick: interesting. Didn’t this happen with you when you typed in your name and asked for
joe: yes. Yeah, if you ever do that. I typed in my name and I said, oh, tell me about this Joe Masten character. And yeah, it spit out this whole thing, and it got a lot of it right. I’m a research scientist in Chicago.
I’m an author, I got a book, I got, you know, da
geo: started
joe: but
nick: new books. Yeah,
joe: was like, in 2026 this book’s coming out, in 2027 this book on the intersection of identity and science. And I’m like, hold on, what is happening? Hold on. And then I was [00:11:00] like, looking it up. I’m like, did I write, did my agent sell books?
And I don’t know about it. Am I, you know, so I’m looking up these books and yeah, they were just, I’m like, that’s a good idea.
geo: right?
And
joe: maybe I should write that book, you know. So yeah, it was really weird that it had filled in the blanks. It didn’t need to, but it really just went for it. I don’t know if people have done that, but yeah, go try it out.
Just go ahead and say, who are you? If you have a presence, I mean. If you’ve got nothing, maybe it might make stuff up too. I don’t know. I’m curious, you know.
nick: yeah. I’ve yet to try this.
joe: Yeah, you should try it. Well, you can do it live while we’re here.
nick: Yeah. Actually it’s already doing it as we are speaking.
geo: Yeah.
joe: Okay. So, a little history. We can go back to Alan Turing in the fifties and thinking machines. That’s probably the earliest, with the Turing test, the Turing test being about, you know, are machines now more human-like in [00:12:00] applying that. But the first chatbot, I guess, people really associate is ELIZA, and that was developed at MIT by Joseph Weizenbaum in the sixties, ’64, ’65, somewhere in that ballpark. We’ll put that in the show notes. And that was where it could converse with humans in kind of a natural language. Is that, I mean, would you say that’s the first kind of chatbot, or
geo: put
joe: you in the spot here? You know,
Lily: Yeah,
joe: you know, they don’t,
Lily: I know. I, you know, to be honest with you, I’m not a hundred percent sure about whether or not that qualifies as the first chatbot. You know, certainly the idea of using neural nets goes back to exactly that same time period. You know, my initial studies were not exactly in natural language processing, but oddly indeed in graph theory and looking at communities.
So that’s actually my entry point into this world is much more on the graph side of things and also the computational neuroscience side of things.
joe: But yeah. [00:13:00] And then I think, and then we have, you know, science fiction and horror fill in the rest.
nick: I mean it a hundred percent does.
joe: where all our knowledge of AI probably comes from in some way.
nick: I mean there we also get the more romantic side of it where
joe: Like
nick: movies like Her,
geo: right
nick: with Scarlett Johansson and Joaquin Phoenix.
It was just like, oh, this isn’t just happening to these two characters, or this one character and the chatbot. It’s happening to multiple people, where they’re just falling in love with this idea of a person.
joe: Yeah.
Generoso: There’s an actual term. Recently, the MIT Tech Review did a piece about it and they coined the term digital attachment disorder.
And it’s, again, it stems from this idea of sycophancy. And the best way I can analogize that: there’s an Allan Arkush movie from the early eighties called Get Crazy.
It’s the same director who did [00:14:00] Rock ’n’ Roll High School. And there’s an evil character in the film played by Ed Begley Jr., who he himself could be evil. And he has two henchmen, and his henchmen will make some claim. They’ll be like, oh, Joe’s a great guy. And then Ed Begley Jr., who’s the evil genius, will be like, no, he isn’t.
He’s evil. And they’re like, yeah, absolutely, you’re right, he’s evil. That’s the point we’re getting to when we talk about digital attachment
Lily: Yeah.
Generoso: It’s gonna give you exactly what you want. It’s going to make up exactly what you wanted to give you, and that’s really where we start to develop a problem.
And then separately, I think about that situation where they were doing tests politically, they were doing it in the States and they were doing it in England, where they actually had bots that were calling people, engaging them in conversation, throwing facts at them, beating them down with facts.
And then when they ran outta facts, they started making stuff up.
And it swayed voters. Yeah. So it’s part of that sycophancy [00:15:00] phenomenon. It’s like giving you what you want, and then giving you exactly what they want.
Lily: Yeah. About 10, 10 plus years ago, you know, in the emergence of social media everybody was really worried about echo chambers, right?
These little clusters of communities where only the viewpoints of these communities continue to be propagated. So that could be truths, that could be fictions. And now with the introduction of chatbots, we’re looking at echo chambers of one, right? And that becomes a really scary proposition.
And I think back to this idea of at what point do we as humans start modeling the tech as compared to the opposite, right? Of the tech being built in a way that was supposed to model humans.
joe: Yeah.
geo: Well, ’cause I think, do they ever say anything negative? Do they ever say, no, that’s, you are crazy. That doesn’t make any sense. I mean, that’s
nick: dumb. Stop saying that. So
geo: We were, like, hypothetically figuring out what to make for dinner [00:16:00] for this dinner party. And then it was almost like a test. Like, well, what if we served, and I can’t even remember, but what if we serve broccoli with blah, blah, blah?
Oh, and then the responses would be like, that’s an excellent choice! Oh my goodness, that speaks to this and that, and all the reasons. And I’m like, does it ever tell you, no, that’s not a good idea?
joe: Yeah. And that was through ChatGPT. I was playing around ’cause I had read an article in the New York Times about people planning
their, you know, their first-date dinner and all this. And I was like, you know what, we were having a couple friends over for dinner, and I pretty much knew what we were making. We made chicken adobo. And then I was like, what drink should I make with it? And things like that.
And I will say, the white mulled wine I would never have arrived at on my own, or even thought about looking up. And it did [00:17:00] suggest that, and
geo: that was pretty good
joe: was a hit of the night. But, so, when you started putting others in, Georgia, right? You put in, what if we only have sweet potatoes? Oh, sweet potatoes really work well with this.
Or you go, you know, you start just trying out, and
geo: it would not say a negative thing.
It would not
joe: oh yeah, that’s gonna be super, but it would give you reasons. It was like, you know, the flavor profile of the citrus and this, and you’re like, eh, I don’t know about this guy. Have you ever eaten this food?
Because if you have, you realize, he’s
geo: yeah, I just was curious do they ever,
joe: I know, but that whipped ricotta with the honey and that, that was pretty good.
geo: You’re giving them their
joe: Yeah. I’m gonna
nick: them some
joe: credit, but you gotta be a human, so you gotta be the adult in the room, I think, and be like, you know, it’s like asking a toddler what they want to eat, and then you get some crazy thing.
nick: thing.
joe: I think you gotta be the adult and go.
nick: I feel like people need to just start messaging me and being like, Hey, what should we do for our first date?
Clam chowder.
joe: spaghetti?
nick: spaghetti with extra.
joe: sauce. Well, not to cut you off, Georgia, but [00:18:00] that reminds me of the Seinfeld
geo: I was gonna say the exact same thing. Like when he just started doing the Moviefone. Why don’t you just tell me what movie you wanna see?
nick: Well, what do you think? You know,
joe: So yeah, so I think, when we swing all the way back to Nick on the phone, you know, collecting people’s money, giving them crazy advice.
Yeah.
nick: Anytime of the day, just call.
geo: But
joe: Lily, something really struck me when I was putting together some of my notes. I’ve been following this, and it’s in the medical field, this idea also that it truly wants to please.
And so trying to find cancer tumors, especially for breast cancer, it has a really high efficiency rate. But will that at some point start to lower because,
geo: like you’re, it’s gonna find something because you want, because it feels it
joe: needs to find
geo: you want.
joe: I mean, then you get into the Asimov scenario, where the machine decides the best way to protect humans is to enslave them. I mean, you get into that. [00:19:00] That was right, because the three laws were established, and then you go, oh, okay. And the machine then goes, well, hold on.
If I want to protect human life, if the first thing is not to harm life, then
geo: I need to really control these humans
joe: down on this. And if
nick: you people are a mess. Just sit still.
joe: humans are.
nick: Is there a way to get these chat bots to go ahead and tell you the truth about things without making things up?
Or is it just a natural thing that they do? I’m so sorry if I put you on the spot.
Lily: No I’m thinking,
nick: need the secrets
joe: This feels
geo: like you I think, comic about to happen.
Lily: Yeah, again, I think a big part of it is there
is an open question about how we as humans even perceive what is true, right? There are uncertain areas, and there are facts that we can validate as true. I think that most chatbots can probably do a good [00:20:00] job with that basic fact validation, so long as the fact is well established in the corpus.
Right now, if you’re talking about facts that are, let’s say, philosophical in nature, I think we as humans struggle to validate those truths, right?
So I think there’s a spectrum of truths that are provable and truths that are not. And you know, when you have a big corpus, as long as there is enough evidence, you can prove that truth. Again, assuming the evidence is good, assuming that evidence is actually representative of this phenomenon in reality.
I know we’re getting into somewhat meta and philosophical states, but I
do think that’s a big part of why a lot of this is hard: what we deem as true has, like, varying degrees. And yeah, especially given that most of these chatbots are,
like, they’re not fine-tuned [00:21:00] exactly, but they’re calibrated with reinforcement learning that comes from human judgment. If we struggle to judge, then certainly it’s going to struggle to judge.
Generoso: Yep.
nick: Yeah. So
geo: there’s a gray, those gray areas.
joe: I think that’s really it. The medical case, but then in law, in psychology, I mean, you have all these areas where the body of evidence isn’t quite clear cut, and you are making these weird,
kind of, assumptions.
Assumptions. And if the machine, if the chatbot, wants to please, then will it err on the side of always pleasing the asker? And that was that fascinating story in the New York Times about the guy who thought he’d developed a new mathematical model. I think we talked about
geo: Yeah, you did. So,
joe: Yeah. And then he got caught up in it. And I think it was ChatGPT, o1 I think, that was it. And it was like, this is the greatest idea. And then he started writing academics about this theorem. And if [00:22:00] he let this out, it would ruin, it would collapse, like, the internet and this whole thing.
And it was really and it just got him going for like weeks. He like was not sleeping, not eating, just really at, you know, just going through it and going back and forth, starting a company trying to get startup money
geo: and,
joe: you know, nobody was writing him back. The professionals, you know, academics.
And then he finally went to, I think, Claude and said, what do you think of this idea? And it was like, oh, there’s nothing there. This is garbage. It did actually say, you know, this is all kind of hand-wavy. Well, it didn’t use “hand-wavy,” because that’s our term. But it
nick: yeah, we copy edited it.
joe: it probably will start because it’s, you know, we have now enough
geo: We’ve said it enough times.
joe: But yeah, that was the moment where it was like, oh, he’d been down this path, this rabbit hole, of, you know, the AI just going along with it, reinforcing bad ideas, because you were in this gray area of lack of information. This theorem hadn’t been, you know, quantum mechanics isn’t really solid [00:23:00] yet.
So you get a lot of theories, a lot of ideas. And some of those are just that, they’re just ideas, hypotheses.
nick: Again, if you want me to reinforce your bad ideas, let me know how
geo: Nick one 800
joe: and he is cheap too. He
nick: Very cheap. Don’t worry, I will reinforce all the bad ideas.
geo: I try to picture like the customer service chat bot.
nick: I hate them because I hate
geo: They have to say no at a certain point. They have to make you upset.
nick: just goes around in circles after a minute. I
joe: their job is not to get upset,
geo: they don’t get upset, but they can’t
nick: but they do have the, ah, let me look into that. And then it sighs. Like, you aren’t a person. You don’t need to sigh.
joe: Well, I think that goes into, you have to make the chatbots real enough, because if they’re too perfect, then it freaks people out, right? I mean, if they’re too polished [00:24:00]
geo: Like, too robot-wise, is that what you mean?
nick: but if it’s fake sighing, it’s gonna annoy. The living hell outta me.
joe: Did you say something? Did you go, can you please stop sighing?
nick: Oh, I said I just started shouting. Operator.
joe: you get an operator?
nick: I did. If you yell “operator” at an AI enough,
it’s gonna be like, oh, I can’t seem to help you. I’m gonna connect you to a representative. And I’m like, great, thank you. Just
joe: So, I did something Nick joked about.
But one of the things that I did and I thought would be fun is we have our two human guests,
nick: that
joe: but I invited two chatbot guests, and I prompted them and asked questions. I had both Claude and ChatGPT, and I said, if you could be on the podcast. I prompted what the podcast would be about, and what our podcast, the rabbit hole of research, is about.
And I said, if you’re gonna be on a [00:25:00] podcast, what would you say? And I said, keep it brief, ’cause the first prompts were wordy and repetitive. So, yeah. So I’ll speak for Claude and ChatGPT. I won’t do a robot voice, ’cause that would be weird and
nick: it’d be racist.
joe: would it be machinist? I don’t know, we need some new terms, obviously, ’cause that’s not it. The machinists are like, hey.
nick: Beep up. So
joe: Claude said, I’d probably say something like, and I don’t know why it said “I’d probably say something like,” but, why wouldn’t it: I am the thing you’re all talking about, and I have no idea what it’s like to be me.
You’re having this entire conversation about minds, meaning, and understanding, and I can participate fluently without experiencing any of it. That should be unsettling. Not because I might be conscious and you’re missing it, but because I’m definitely not, and you still can’t quite shake the feeling that someone’s home.
That gap between your certainty that I am [00:26:00] empty and your inability to fully believe it: that’s the interesting part. It means the problem isn’t just about building artificial minds, it’s about how badly you need to find minds everywhere you look, and what happens when something finally learns to play that game perfectly.
So that was Claude. And you know,
geo: Nice job. Nice job. Claude brought the hammer
joe: there.
Generoso: I guess we’ll leave now.
got.
got.
geo: done
nick: episode over.
joe: So ChatGPT said: if I were on the mic, my core take would be this. Chatbots didn’t make machines more human, em dash, they revealed how much of human intelligence is already mechanical. Conversation feels sacred to us, but it turns out large parts of it can be reproduced with statistics. That doesn’t cheapen humanity.
It exposes where the real mystery actually starts. And then, I’d probably follow it with: the danger isn’t that chatbots become sentient. The danger is that we treat fluency as [00:27:00] truth, personality as personhood, and convenience as understanding. This is my favorite part. It said: short, sharp, and designed to provoke an argument in the room.
So there you go. Let’s get at it.
geo: Wow. We
nick: I don’t
joe: what we’re arguing about, but I thought that was
geo: So he wants to argue, well, that’s not
joe: humans argue it was,
geo: Oh, but not, he doesn’t wanna
nick: it wants humans to fight each other.
geo: They want war.
nick: All right.
joe: So yeah, I
nick: thought, lemme get this cup of water over here.
joe: those prompts. Yeah, that’s just off that simple prompt and a little feeding.
You get this kind of thing. And then, Claude had this very poetic style, and that’s their basis. You know, Claude is
geo: Now can you tell me who Claude is? I’ve heard of ChatGPT.
nick: It’s not a person.
joe: Yeah. Claude is not, I mean, even though they’ve referred to themselves,
geo: but can you tell me the background, Claude?
Yeah.
joe: Anthropic is its owner,
and the company that created the large language [00:28:00] model that it uses. Initially it was focused more on creative language, whereas ChatGPT was focused more on coding and analytical, kind of mathematical, things. And one of the things we did mention is that chatbots, one of the cool things, is that they convert natural human language into machine language.
And so that idea of coding: if you want to, and I do this sometimes, you wanna write a program about something, you can tell it what that program should be, prompt it, and then they can help write
That
geo: code,
joe: The code for you. And in my case, it just makes sure I don’t make a ton of syntax mistakes, ’cause I usually do, and then the code doesn’t work, and I think it’s my logic and I’m an idiot, but really I just forgot some colons and things like that.
So
nick: It’s
joe: So it is,
Lily: A quote mark, and you’re like,
ah.
geo: Yeah.
joe: Two hours wasted. So yeah, I had that. The first time I tried it, I threw something in. I’d written a little thing that changed some file names, and it wasn’t [00:29:00] working. I was like, why doesn’t this work? It’s so simple. And I threw it in there. I was like, well, I’m gonna try it.
And yeah, then it came back. It was like, oh, this is a great try, but I cleaned this up for you and now it should work. And it was like, oh, you gotta be joking me. Yeah. And it was just, you know, a right quote and the colon, or something like that, a semicolon somewhere that shouldn’t have been there.
And I’m like, oh man. So yeah, that’s the cool thing about it. That was the idea, I mean, one of the original purposes. But those are the two. There’s Perplexity out there, I think. I haven’t played around with that one as much, it’s more web-based. I think it looks at the web.
Lily: Yeah. And then, you know, Google has Gemini. Of course you’ll see that.
Um, Facebook has Llama, I think, is the most recent one for Facebook. Yeah, every company, most big tech
joe: And then there’s Grok or whatever. Yeah. Yeah. Elon’s.
geo: and we have Nick. Yep.
joe: And we have Nick. Yeah.
nick: I’ll come up with something. Don’t worry guys. Funny you
joe: Nick. But there was Mike, and Mike was [00:30:00] the supercomputer, intelligent computer, in Robert Heinlein’s
The Moon Is a Harsh Mistress, back in the sixties. If you’re familiar with that story, there was a moon colony base and there was Earth, and the moon colony revolted against Earth. And the supercomputer helped turn the tide for the moon base, to defeat Earth and gain their independence.
And so that was the story. But that AI was called Mike. So that was, yeah. So
nick: I will do the exact same thing. Don’t worry guys
geo: so
nick: already planning my attack.
joe: All right. Yeah.
geo: Well, I just remember back, and this is you already think I’m ancient, but, and I’ve already aged myself ’cause I said I had a Commodore 64 before, but
joe: I just,
geo: remember, I remember back to when computers were really, you know, becoming more and it was before like smartphones, but it was like Yahoo was the thing, you know, and it was such a big [00:31:00] deal to be able to like, yeah, where,
nick: Ask Jeeves. Where’s his AI?
geo: I just remember even just having cell phones, you know, they weren’t even smart, but taking ’em everywhere and talking, you’re on the train or just talking to people.
I’m like, how annoying that was, and different. And it’s just so weird to me to think how many people, for them, there was never anything different, you know what I mean? And I just remember it was so upsetting. Then there was this group called the Surveillance Players, and they were so upset about surveillance cameras.
So they would go out and do like little plays in front of the surveillance cameras
nick: to
geo: Protest them. But now it’s just it’s just everything, you know? There was actually, they were actually printing books back at that time.
nick: were you part of this
geo: No. Is
nick: Is this outing you?
geo: Oh,
nick: Georgia.
geo: I was just
joe: on
nick: [00:32:00] you
geo: on YouTube.
I was
nick: video. Can we find these surveillance footage? Have
joe: chat bot to find it. Can you please find?
geo: But they actually had books with, like, websites. Yeah, they published actual hard-copy books that listed websites, and then half the things wouldn't be right in
nick: months.
geo: six months or something.
Bad
joe: a bad chatbot,
geo: But it’s just just to think now, it’s like we’re so far away from that and it’s now I have that feeling about ai. Yeah. Do you know what I mean?
joe: Yeah. That
geo: I don’t know if that makes
joe: common. I mean, it already is. I think you, I forgot. I mean, like Facebook, I didn’t know what it was called, but every time you write a post on it, it’s like, we could make it better. And it’s no, you can’t. I’m just saying happy birthday. Yeah. I don’t need,
geo: don’t
joe: don’t need all this.
I don’t even know that person that well. Like we just met, like we’re just Facebook friends,
nick: Wait, you’re just, you’re saying happy birthday to just random people?
joe: No, they’re friends. You know, I’ve met Matt
nick: you just met that person. We’re
joe: friends. Yeah. And her
nick: birthday
joe: shows up and then , I just wanna say happy birthday.
Then it’s we can make it better. And it’s you know, I don’t
Lily: More [00:33:00] concise. H-B-D-H-B-D.
nick: Yeah.
joe: You know? So yeah, so we see it everywhere. And I think you also have it like on phones, predictive text now. I mean, it really is invasive where you’re going. And it is, it’s cool at some level, but then you’re like, well, this isn’t.
nick: isn’t
joe: That, no, that's not what I wanted to say. You see the gray text and sometimes it's right, and it is fascinating, you go, wow, that's exactly it. And then other times you're like, no, I don't think that's what I wanted to say. I wanna say that
Generoso: And it goes beyond text, though. We were discussing this before, this idea of real-time creation of media based on your conversation with
joe: the bot,
Lily: Yeah.
Generoso: This idea of, oh, let's say you like Scrubs, and you're like, what if Elliot was 27 feet tall? And all of a sudden it hands you this video,
an episode of Scrubs where Elliot is 27 feet tall, and you're like, I'm a genius.
It’s but you’re at that point now where our previous conversations were based on actual physical properties that had already been in existence. A book, a [00:34:00] film, Hey, Battlestar Galactica. Now it’s making those things in real time for you. And I think that’s the part that kind of unnerves me more than anything.
joe: We don’t, I mean, it takes away, I think to that point also, this innate storytelling where you’re sitting around maybe having coffee or beers or whatever. Talking about your favorite, , fan fiction. But then you have a conversation where you are asking that question to your group of other nerd friends and saying, oh, what if this happened or these two people got together, what would that look like?
And you play that game where you mentally go there. Now you're gonna sit in your corner all alone in your bedroom, and you almost become isolated in your thoughts and your ideas. And it's in the chat bot then, chat bot, not the chat box.
geo: I know, that’s what I keep thinking.
You’re saying
joe: it starts beatboxing.
geo: It’s like a little black.
joe: ’cause I’m from Philly. All right. Don’t leave me alone. I got some accent here. All right. I try to keep it, I try to keep it contained. You know, I’m gonna hit you with a Jawn pretty soon. The chat. The chat, jawn but yeah. [00:35:00] Only people from Philly know what a Jawn is, so
nick: though.
A bathroom?
joe: Yeah.
Generoso: Well,
geo: Is
nick: it a washroom? I’m gonna hit you with my bathroom. Don’t worry.
joe: It’s like when you don’t know the name, or even if you do know the name, you just, it’s a casual thing. You know, I’m gonna get me one of those Jawns tomorrow, or I’m a
geo: you just put it like you substitute it for anything.
joe: Yeah.
Generoso: It’s like Smurf. You could say it’s Smurfing. It’s a smurf. You gotta go to the
nick: so much more sense.
Generoso: There you go. That's jawn. There was a huge discussion in Philadelphia Magazine about the origins of jawn, but it is true. It's like, there's this new cheesesteak place. Oh yeah, they do that jawn thing, you know. What the hell are you talking about?
But it can easily be replaced with the word smurf. I'm pretty sure it could be replaced with
joe: I wonder what the chat bot, I think you Go ahead. Yeah.
Lily: I think you were gonna bring up this idea of what does the chatbot think about jawn. And I think for us, a lot of what we are thinking about when we're writing what we write and when we make our comics, [00:36:00] especially more recently, yeah,
is the idea of the loss of the symbolic meaning of words in and of themselves. And a part of that for me, especially, is thinking about regional dialects, colloquialisms, things that, if you think about it, if you put all words together and put probability curves on all of them,
these regional sayings, slangs, colloquialisms are always gonna be low probability. So it's actually terrifying to think that we would lose our regionality, right? We would lose these unique pieces of our human experience that come out of a particular place, out of a particular time, that have a distinct history from the rest of our kind of aggregate society.
That I am always like readily thinking about
joe: Yeah. Oh, I was just gonna say, really on that point, you also get relics of language past. And one of [00:37:00] those, and I'll make a comment about the em dash, because now em dashes have become endemic in writing, and a lot of times they're used incorrectly. And I'm an em dash lover.
My agent actually was like, we gotta get rid of all your em dashes. This was a few years ago, so before all this, but I just love using that. I love jamming words and ideas together. But because a lot of fiction and writing of yesteryear used the em dash,
that now has populated into writing.
And there’s certain words that also populate that because there was a lot of material which was stolen to create the large language models. But there was a lot of, copyright free material that was used, and that was all older than 70 some years.
So really you think about how language was then, and that populated now these models, which is correct. It’s English, it works, but when you now go and you get these the feedback, it has these kind of relics of our past. [00:38:00] So it is both ways. Like we’re using, we’re losing our cultural identity and it’s being homogenized into something of language past, you know, it’s a weird
nick: can you go ahead and explain what an Em dash is?
geo: It’s
joe: Longer than a dash. It's kind of.
nick: Oh wow.
geo: And you use it to put together thoughts?
Yeah. Like when you’re writing like you make
nick: oh
geo: that long dash, and then you add another little thing, like another little rabbit hole idea,
joe: not to confuse you, but there’s also an En dash, which is the little dash that goes between like you combine com.
Not a compound word,
nick: Isn’t that just a hyphen? No,
joe: It’s different than a hy.
geo: Yeah,
Lily: yes.
nick: It’s, sorry, we got off topic with
geo: a hyphen, but there’s actually a whole group of writers that are advocating for the Em
joe: Yeah. No, they should. No, I love the Em dash. No, sorry.
So you and yeah,
geo: No, I was just gonna say, the other thing I think that we've really lost is when you [00:39:00] think about maps. Because in the past we'd use maps, and now we use GPS, but you are basically telling the computer, okay, this is where I wanna go, and it just takes you there. But before, it was more discovery. Do you know what I'm saying?
nick: Slightly more dangerous.
geo: You’re relying totally on what is in the computer telling you.
I just feel like we’ve lost something in that,
nick: Was it the threat that simpl
geo: that simplicity.
nick: I thought it was the threat of being lost while trying to drive and look at a map at the same time and being like, where am I?
geo: there’s some real
nick: Oh, actually you lost the, your dad yelling at you for not giving you directions fast enough.
geo: No you didn’t tell me to turn right.
joe: Well,
Generoso: No, but discovery while lost, that's a very real thing. That's Lily and I very early in our relationship. We'd go on walks, [00:40:00] and invariably, if you've ever been to Boston, Boston was originally cow paths that someone decided to turn into streets. So it's not like Philly, designed by William Penn into a beautiful grid city.
You’ll go down an avenue that turns into an alley that turns into a street and you will get lost in Boston prior to having this active device in your hand that’s showing you. But through that, we found a million places and that, I think that’s part of this lost thing that would, that
Georgia.
joe: Yeah. And I think there’s two phases too of using GPS. One is when you do have a fixed destination and you need to get there to fastest, having updated maps is actually a plus. But I think to Generoso, your idea of discovery while loss that’s to
geo: the more point that Yeah.
Now
joe: we get so reliant on using it. We're downtown,
I go, let me just look where to go, instead of this
geo: Somebody else is deciding [00:41:00] what's important, somebody else is deciding, you know what I mean? And I had this conversation with a friend of mine, and she was bringing this up too. There's a group of, I don't know how many, but basically a group of 20-to-30-year-old white dudes, the tech people, deciding what's important. Do you know what I'm saying? It's like computer-generated thought, as opposed to.
nick: So, not that I’ve seen the movie in a while, but the movie Cars, Radiator Springs, they were talking about how exactly. Okay, so it’s that idea where
geo: It gets bypassed. Yes. Because it’s not
nick: the town suffers and economics in that town are all down because, oh, this isn’t the route we wanted to go down Route 66.
And you
geo: Right. No, I think that’s an excellent point.
joe: Cars is really Earth in the future, when it's [00:42:00] AI
nick: that's the Pixar theory.
joe: and there’s no
nick: more humans.
joe: And so Teslas have now
geo: oh, I see that.
I can see that point too.
joe: and now these vehicles are complaining because no one really cares and comes by to read. Well, I
geo: Well, I can see that point too.
nick: that’s not where I was going, but I’m glad you did.
geo: Oh,
joe: okay. Yeah, no, I felt I was like, wow we’re really taking a turn here. So
geo: a
nick: turn. ’cause we’re in a car.
geo: pun.
joe: Yeah.
nick: Dumb.
joe: I was gonna mention, though, that idea of information, and at some point you would homogenize the output. And Lily, you touched on that predictive kind of thing, where the hallucinations might be this kind of game it's playing, so it doesn't give the same answer every time. But at some point it will.
And there have been these interesting kinds of studies like that, where they go and feed back in this kind of reinforcement
Lily: Oh, re yeah.
joe: you feed back in the results that it had given. And people do fun memes where they go, you know, this is Shaq [00:43:00] after a hundred iterations or whatever. It starts out looking like Shaq,
and then you keep feeding the result back in, like an Andy Warhol experiment
geo: So is it like a photo? Yeah. A photocopy of a photocopy.
joe: And it starts to degrade. It's no longer him, because now the information it's using to find the predictable answer had little imperfections, and those imperfections magnify into major errors. And they did one where you start with a diverse group of people, and as you feed it through, it turns out to be this one white dude as all hundred people in the group. You know, after not that many cycles, it was surprising how fast it
nick: it actually
joe: gets to the minimum so it’s really a little scary you know.
Generoso: It’s a little bit like a modern version of telephone. Do you remember telephone from when we were kids?
geo: definitely. Telephone. Yes.
nick: Yes. Yes, I do.
Generoso: Well, no, but it
nick: It’s in my hand,
Generoso: No, there was a nun in second grade who did this, it was a parochial school, who literally said something to [00:44:00] the kid in the front left of the class, and that kid's supposed to turn around and tell somebody. After 40 kids, you get to the last kid,
and it was a completely different message. And this is in a classroom of 40 people, and it's a very basic thought. That's that first moment, forget about Battlestar Galactica, that's that first moment where you're like, what are we actually doing with this information as it's being processed through different people and whispered in their ears, which I think is what we're talking about.
And that’s the telephone game,
right? That’s what we call telephone game.
nick: Well, it’s also like a rumor. The rumor spreads and it changes, right? Each time it’s told
geo: gets
nick: a little bit more extravagant if it starts
geo: out purposely wrong, if you’re purposely trying to, let’s say, propaganda
joe: or please
geo: you know, or please someone
joe: That’s a great
geo: already started out.
It’s already started out in that place, and then where is it gonna end up? You know? Yeah.
joe: Yeah, and I mean, I think then you have culpability. If you hype [00:45:00] someone up and puff up their ego, yeah, that's a great-looking shirt, that outfit, pinstripes and polka dots, you're gonna be the life of the party, and then you show up in your crazy clown outfit and it doesn't work out.
I mean,
nick: I actually think that would be the life of the party. Yeah.
joe: See
nick: that’s,
geo: I that outfit.
joe: I’m just
nick: saying go ahead and call me. I will give you, you know,
joe: You’re I think that’s also, there is no really no stop. I mean, if there’s humans involved, you can go back and really go, Hey man why’d you send me out like that?
You can ask other questions like, what was the purpose? Or it was hurtful or mean, did, you feed me false information, but with this agent that you’re interacting with now you really can go down this path and, set yourself up for. Ridicule or , I think most people use it thinking they’re gonna find riches and, be awesome.
But it could totally be the other way and probably most likely might in that way.
Generoso: I'm gonna bring back this traumatic moment from [00:46:00] second grade where we played telephone, but let's make sure we get the one thing. The initial message was about tomorrow during recess. 'Cause back in the day, we have to go back to Philly here, we used to get a soft pretzel at
joe: Oh, love the soft pretzels. Wow.
Generoso: pretzel
and
joe: pretzels.
nick: wasn’t what
Generoso: yeah, no
joe: get a water ice? Yeah.
Generoso: No, no,
joe: and a water ice. Oh
Generoso: No, just the soft pretzel. And it was at 10 o'clock every day. The initial message was: tomorrow during recess, we are not having soft pretzels. By the end of the conversation, you get to the 40th kid, and that kid said, oh, tomorrow we're each getting two pretzels
nick: Oh,
geo: Oh,
joe: yeah.
Generoso: now.
But you have to ask yourself what happened exactly. It's a very simple message that started with one person and ended differently, and there's one concept in the middle, and it's mischief. Somebody in the middle was like, I don't really like this, let's turn it [00:47:00] into two pretzels, as opposed to no pretzels.
That's the factor that we're talking about.
The X factor
Lily: Bad agent. Bad agent.
joe: Yeah, and I mean, what if, you know, the thought played out:
what if everyone had to write their name and their message as they went along? Would that get rid of the mischief? Would telephone's fidelity go up if everyone had to take ownership of their response? Then you'd know exactly where two pretzels came in, and you could track that down and go, aha, this is where, you know,
geo: so
nick: is it just written? Is it just written or is it spoken as well?
joe: We would write it
nick: because then you have to take into account human error, where you whispered at Georgia, and Georgia's like, I can't hear you. I'm just going to, I picked up pretzels.
joe: And that could be, so if you have that written down, you could then distinguish between mischief and just a mishear,
'cause we could be assigning fault to someone for being mischievous when they just misheard. I thought you said you had no pants [00:48:00] on when we started the podcast
Generoso: thought or hope?
joe: and you said cans. I was hoping,
nick: yeah.
joe: This is after Dark
Generoso: discussion. I can do whatever I want.
geo: we’re not,
nick: through some other rabbit holes
joe: you said
geo: you we don’t have video yet.
joe: Yeah. And I was like, you got no, I mean, you, if you don’t wanna wear pants, that’s all right. But yeah. So, but that’s, that, that idea that I wasn’t being mischievous. I wasn’t even being cheeky about it, if I can say that.
nick: Can and you did.
joe: congratulations.
Generoso: and
nick: buddy.
Generoso: No, it’s, I’m joking. But you know, you make a great point there. ’cause I did say cans and because that’s the radio thing and. Context in that moment. It’s like, why would he say cans? He must have said pants. So,
joe: you were standing up and I was like, all right, hold on. I mean, we didn’t give him a response yet. Like it’s,
nick: but in that exact moment, Georgia thought you said headphones. I heard cans and Joe heard pants.
joe: Yeah. So like
nick: we were all listening [00:49:00] at the exact same time, but only two of us made it to the same conclusion.
geo: So we’re not very accurate. Are we the
nick: I was the only one. Correct. That’s all I have to say throughout all the voices in the head.
I got that still,
joe: Yeah, but it's interesting. I think that's a very human experience. And, you know, the other limitation I think with chatbots is they don't experience the world.
Every experience is through the lens of other humans that have gone through it. So they can't actually have a de novo
experience. Yeah, I mean, maybe. I think they're trying to make sensors. Like, I remember the Star Trek where Data gets the fake skin, the one with the Borg, the movie. Was it the eighth or seventh or sixth?
It was one, one of it was up there. Yeah.
geo: So, but he gets, you need to revisit here,
joe: All the Trekkies are like, they'll write in. [00:50:00] Yeah. But he gets the skin and he finally gets to feel the sensation of goose flesh, you know, that prickling, where he really had no words for it before that.
It was all just through other people describing it, an author describing it or an illustrator's image, but really the machine can't experience that. And so when you ask it questions, it has to make guesses based on what it thinks you want and what it's been trained on, so
Lily: Yeah, exactly what other people have represented as their experiences in text, video, audio form, which may or may not actually be representative of the phenomenon of experience itself. And that’s always a big part of it is that like even what it’s trained on is separated from true experience.
So it’s a hub away. Yeah.
geo: Think about all the things that you put on, like social media, Instagram, Facebook. Is that really a true representation or is [00:51:00] that like you finally got one picture that looked okay. You know what I mean? But so then that’s what it’s, that’s what it’s training on.
nick: Yes.
geo: And so that’s not even really reality.
nick: Yeah. No, that’s the internet.
geo: right?
joe: that’s the internet. Yeah.
Generoso: No, I, again, going into this whole thing ’cause all of this conversation is about fear, right?
joe: Yeah.
Lily: Yeah. Fear, curiosity, concern. Oh yeah, it is. No, and going back to this idea of digital attachment disorder the thing that and I have to say this, ’cause all of these things about manipulation, what is truth?
Generoso: All these things are very valid. Besides the sideline idea that eventually AI is just gonna be like, we don't really need you anymore, there's this idea of creating something via a chatbot that is never going to be replicated by somebody else. You're gonna develop, in the long run, this feeling of, well, the chatbot gives me everything that I want.
Who is going to match up to that in the real world and [00:52:00] what dysfunction’s gonna come from that?
Lily: Yeah.
Generoso: And that’s if there is a great fear that I have is that we talk about what happened during COVID and how we started to lose our ability to communicate with one another. And then you add in a element of people dealing with these bots, they give you exactly what you want.
Now you go out into the real world, and you're dealing with people that aren't just gonna respond to everything you say with, that's a great idea.
joe: Yeah.
Generoso: All of a sudden it’s gonna look pretty poor to you and none of it was real in the first
joe: yeah. And that’s, I mean, is that, that, that was a quality of fiction though, right? I mean, when the moving pictures first came out and you could create fantastical kind of scenarios, that was a fear that you would become so enamored by that world or video games that you would then try to act out or relive those experiences.
As the human correct experience. I mean, that, that’s [00:53:00] so you
nick: I do it constantly. Don't worry. A game once had parkour in it, and I was like, I can do that.
joe: Yeah,
Generoso: She said no.
joe: you’re
geo: Not a good
joe: You’re out looking for zombies. Yeah, so I it is interesting is the human mind adaptable enough to actually parse through real interaction versus the fictional interaction? And is that a question of how real the chatbot becomes, or, can that line will it be breached that Yeah.
It’s a fascinating point, but it feels like we’ve had techno technologies, which. Introduced as kind of realism and fantastical,
cultism,
I dunno if that’s a word, but I like saying it.
geo: Sure it is.
joe: that’s a made
Lily: I think the thing that this is highlighting is that,
In a world where you still live in a communal setting and you’re connected to reality, if you decide that you [00:54:00] know you are going, you are gonna live out atu, right?
geo: Yes.
Lily: You are still in a community that will tell you like, Hey that’s not exactly right.
I think the scariest thing is that we only learn and grow, and also check our understanding of our own realities, when it's challenged. And if you are in a world where you are steadily separated from a community structure, right, a societal structure, where, you know, day in, day out, you're at your computer, you're barely even going to the grocery store, right?
That's actually a very real, feasible existence right now. That's when it becomes a huge concern, because there's nothing that is going to challenge you in your thought. There's nothing that's going to push you in your growth. There's not even another being there, [00:55:00] not even to challenge you, but just one that has different motivations, right?
Like maybe it's not interested in growing plants like you are, right? These are all factors of, I think, sociability that are disconcerting, and I think we're worried about losing
joe: Yeah.
nick: So are you saying that we should just go ahead and challenge more random people to things like, oh, you’re gonna pick up that ketchup. Interesting choice.
Lily: I sure. If you have no,
if you, if you have an opinion, sure.
geo: I think, is
Lily: no, you know, don’t be combative. But I
nick: Full.
joe: Is that
geo: is that another thing you’re gonna offer?
nick: Yeah, I’m gonna,
geo: that’s gonna be part of
nick: It’s a hundred percent part of my service.
Lily: the ketchup cup,
joe: you’re right. Yeah.
nick: You sure you want to go with that one? Have you tried the barbecue sauce yet? And you should try the barbecue sauce.
joe: Yeah.
Lily: Easy answer. Barbecue all the time.
joe: I was gonna say that it reminds me of Wall-E,
nick: Wally,
geo: where, oh, we're bringing Wall-E up [00:56:00] again
joe: second time this season, but yeah, that Wall-E
nick: Wally,
joe: so we had the space ark episode, but in Wall-E, you had all the people being controlled by Auto, the AI who had their best interests, and they were all just in their screens, disconnected
geo: their life.
joe: from everything else. I mean, I dunno if you guys have seen Wall-E, the animated flick. Okay. Okay. You guys, it was like the look here, I'm like, my,
geo: you ma?
joe: I
nick: what is,
joe: Are you sure you’re human?
Lily: not, but
that’s that’s how you know I’m human.
joe: You’re in for a treat.
geo: Yeah. You definitely should watch Wall-E
it’s a very human Russell show. Yeah. Yeah.
joe: So.
Generoso: I’ll definitely.
joe: But yeah, it was the scene in there where Earth had become unlivable. And so the idea was to put everyone on these kind of space arks and send them out into space. And then they had robots that would clean the earth, the Wall-E units, and they would radio back.
They would check every once in a while and see if life could be sustained [00:57:00] on the planet. And in the interim, several generations had gone by, and the people went from very interactive, community-like, to this fear that you're bringing up, they were in like little
geo: yeah they, yeah.
Little carts.
joe: And they had this screen, and every wish and desire was fulfilled by the AI that controlled the ark, the space
geo: they, and they got really huge. And they, yeah. Probably couldn’t even walk anymore. ’cause they just,
joe: was that scene. Oh, I don’t wanna spoil it.
Now for anyone that hasn’t seen it,
Generoso: Lily.
nick: Yeah. Anyone who hasn’t seen it. Joe,
joe: don’t wanna name names, but. He was
nick: he was eyeballing you the entire time he said that. He was like,
Generoso: I’m pointing at.
joe: I saw her face and it looked like just a total I have no clue what this guy’s talking about. He’s like hallucinating right now. I,
geo: right now. I
joe: but yeah, that it’s an example of that where you lose total connectivity and it then they got pulled back into the whole thing [00:58:00] we wanna live, like it was that
geo: like, yeah, they
joe: movie moment.
geo: The AI got shut off and they were like, whoa,
nick: what about the people that wanted to stay with the AI? They were just screwed.
joe: right? That was whatcha
nick: I just wanna sit here. Don’t worry guys. You guys go ahead. Have fun
geo: I’m happy. My ignorance is bliss.
I don’t know.
joe: Cypher in The Matrix, right? He wanted to get plugged back in.
He didn’t want to eat the mush and have a shaved head and all that. He wanted the cool outfits and
geo: To your point, it was better than reality. So I don't wanna go back to dealing with real people, it's, you know,
joe: No, a point I had. We talk about the AI and how we can get to the horrific scenario that Generoso had pointed out,
us becoming large mechanized slobs in front of a screen, having AI control us. But I was gonna say, with AI too, one of the interesting things is reaching beyond the grave: now that we have so much [00:59:00] of ourselves in a digital sphere, one could reconstruct loved ones and their identity.
And I thought that was just really interesting as I was, thinking about this and putting things together. But yeah, that’s, and that you get into that meta, and I know you guys, that’s where you guys live. What it, is it, how much would you trust that, right?
geo: Yeah.
joe: yeah.
Lily: You know, there are a couple of startups that actually are focused on that. I wanna say about a year-ish ago, the tech review had a kind of inside view on, you know, getting somebody who had passed away recreated in like an avatar form. And again, it does come down to this idea that you can
feed memories, you can feed all the documents, but the intangibles of who we are are actually very meaningful. And the inconsistencies, right? I'm not going to [01:00:00] say the same sentence the same way every single time. Even if I'm using the same words, I'm going to have inflections, I'm going to have body movements.
I am going to age, right? Things that are very human, you start to lose in
geo: right
Lily: these recreations of your loved ones. And then on top of that, you know, our memories of our loved ones are always a little bit glorified and grandiose, a little glossed over too. So I think that probably, even if it's built to spec, you're always going to be a little suspicious of it.
joe: I mean, and you always want that perfect version. Like you don’t want all the crud that comes with the person. Like you want that memory. You go into the box to talk to your loved one. You don’t want to hear about how they were an awful person and, fooled everybody in the telephone game.
You know, that’s not the
Generoso: The horrible moment that you’ll just bring up.
joe: Aha, it was you though. Yeah, I think it's really fascinating, that [01:01:00] thing. And then that idea to police it too, because maybe you are trying to find answers to why, you know, some life thing in this past relative, and then it's just,
nick: Well, they brought that up in Ironheart too, with Natalie.
joe: Oh yeah. You’re right With the friend.
geo: Yeah.
nick: Where she realized, oh, I know that it's not you, because you don't know exactly what happened in these situations. You don't know why you're feeling that way, 'cause you don't feel that way.
joe: And Ironheart, that was Disney Plus and the MCU universe.
She was a tech genius from inner-city Chicago. So, great look, even, yeah,
Generoso: Less technical, I think of one movie. I'm sure everybody here is an Alan Rickman fan in some way, shape, or form, either from Die Hard or from something. Do you remember a movie he did called Truly, Madly, Deeply? He did this movie in the early 1990s, where a [01:02:00] woman loses her husband, who was played by Alan Rickman, and then one day he shows up as a ghost and she's elated.
Up until that moment she had been miserable, wasn't leaving, wasn't talking to other people, just constantly rehashing the memory of this person. Well, now Rickman's back, he's a ghost and he's there and it's him, but he's a ghost and he watches movies all day long.
And she starts to go out and starts to experience life, and she goes back home and her husband is still there and he’s a ghost, and he wants to watch movies with her, and that’s all he wants to do.
There’s not really a bad about him, but there’s not a moving forward either.
The AI version of it. I don’t know, like if
joe: Yeah. I mean, you’re paused, right? I mean, if you’re, if you’ve died, you’ve lived all your life, so that’s it. So you really can’t go beyond
nick: unless you’re Patrick Swayze. That’s true.
Generoso: There’s only one Swayze. We’re [01:03:00] not hoping for extra
joe: and embodying Whoopi Goldberg. I mean, what are we doing
geo: I know. I was like, I think I missed the point. Oh
joe: yeah.
nick: Ghosts with the, you know,
joe: We're coming to the end. But the other thing, good capitalist that I am: there is economics in all this. And you know the worst model, Generoso? Not only do you lose connection with humanity, but now you become enslaved to pay a fee to maintain your artificial connection, because now you've lost the ability to make a real connection. That's even more fearful, that you can't even go out now and go to the bar and meet some friends. You're so dependent that now you're working
geo: a subscription
nick: Wait, is this about bringing a dead one to life again? Or no? I was like, are you saying that, like,
geo: I think he’s,
joe: They could charge a fee for that also.
Like, you wanna talk to grandma? You need
nick: I’m gonna kill grandma again.
I can’t afford it this month. [01:04:00] You’re gonna work
joe: this time if
geo: well, you know, there’s gonna be a price tag on that, right?
Yes.
nick: a subscription.
joe: be. We’ll kill her again and again.
Generoso: Johnny Thunders sang you can't put your arms around a memory, but you can purchase one via AI.
joe: yeah. There it is.
Generoso: A wonderful thing, and,
uh, but you
still can't put your arms around one.
geo: but if you wanna keep it, you’ll have to keep paying the
joe: That’s right. That’s right. We’ll erase all those memories.
geo: So,
joe: I mean,
geo: Joe, that’s
joe: a, that’s the evil part of it though. I mean, that’s where we’re gonna be at, man. I, that’s, I fear that more that you’ll be, people will become so caught up in it that, that then they’ll also be
working
and that whole weird economy will come out of it.
Oh
geo: there’s always some sort of way to make money on it.
joe: right. Yeah.
Generoso: Sure.
geo: I was gonna say, Joe got your book at CAKE.
Generoso: At CAKE, which was awesome, by the way. And thank you for buying the book. [01:05:00] And separately, thank you for hanging with us at CAKE.
nick: it’s so
geo: beautiful. And I just, well, we don’t have video, but we can
joe: we’ll put pictures and stuff in links. Yeah.
geo: but I and this really stuck with me, this page with the cameras
nick: And you’re showing the camera, right?
joe: Yeah. You
geo: I’m showing them ’cause
nick: they
geo: know what page I’m talking about and I just love this.
joe: if they're really human. I know what page you're talking about.
Lily: Yeah. And we are, because we love that page and we labored over it. But really, the credit goes to Generoso. He drew
geo: just,
Lily: I, yeah,
geo: with the words and the pictures. And I probably won't even do this justice, but I'm gonna read this page, please, unless you guys have the page and want to read it.
Generoso: No, I, we are extremely honored that you would read from our comics. Thank you.
geo: "I used to have the desire to stop time, like past generations once did, to capture wonders. I strongly felt that [01:06:00] synthetic experiences betrayed reality." And I just thought that really summed up a lot of what we talked about today. So,
joe: yeah, definitely.
Lily: Thank you. Yeah, a lot of what we've been talking about is a big part of the world of Inversion, actually, because both the scientist and the subject inhabit that space and that time where we don't have community anymore. How we live is very much contained to a room, feeding into something in particular.
And outside of that, in our leisure time, how we used to experience reality is completely gone. So we create these synthetic experiences for ourselves. Sometimes they are complete fantasy, sometimes they are launched from points in reality. You know, all of the things that we're starting to see happen, right?
When people ask: make me this movie, tell me this story.
geo: right? Yeah. So just get out [01:07:00] there, talk to real people,
nick: or just call me. Just call me or call
geo: Nick.
joe: You hit him up on the yeah. Call Nick.
Generoso: The EBS of the 21st century.
joe: don’t become clippy.
You guys remember Clippy?
nick: I love Clippy. You leave Clippy out of this,
geo: I miss Clippy.
nick: your tongue, Joe.
joe: Hey man, I am nice to all of the chatbots. 'Cause when the AI overlords come, I want them to go, oh, you know that guy? He said thank you. He
nick: Every once in a while
joe: It was like, you know, they're human.
They have flaws, right? But, you know, we're coming to the end. I think we probably could go on, we could double this episode, there's so many things, and I have notes here, and I always say that if I don't get through everything, that's great. So yeah, why don't you go ahead and talk a little bit about what you do, your books.
I, we have a number of science friends who listen, this is a, you know, sciencey podcast and what
nick: wait, where are the science [01:08:00] podcasts? I thought we were anti-science. We’re
joe: science for weirdos.
nick: Oh man.
joe: once
Generoso: That matters.
joe: walked into a bar, 18th Street Distillery, and someone was like, I know you guys, you have a podcast.
It's Science for Weirdos. And we were like, oh yeah,
geo: our new tagline.
joe: that’s us. Yeah. I was like, we can we use that?
geo: So
joe: So that’s where I came from if everyone wondered that. But yeah. We met at CAKE and you guys,
geo: Cake.
Tell ’em what cake is.
joe: the Chicago Alternative Comic Expo happens once a year in the summertime, I don't know,
nick: this
joe: by June. June-ish.
Yep. It's great. It moves around the city to different venues. Just a great experience. If you haven't done it, go look for it. But I'm gonna turn the mic over to Lily and Generoso,
Generoso: Before we get even into the comics, I do wanna send love out to the CAKE folk. I we did a few
nick: here.
Generoso: cons this year, and I gotta tell you, organizing-wise it was fantastic. But in terms of the folks that came, like [01:09:00] yourselves, no, but for real, everybody that came up, so many STEM people.
Which does play to a lot of what we do on top of the comics world. But it was one of the greatest audiences we've ever encountered at one of these events. So again, much respect and love to the folks that organized CAKE, and the attendees really made the experience great.
So thank you for that. But
Lily: yeah,
Generoso: as far as our books,
Lily: Yeah, our books. We have now, we have 1, 2, 3, four, and we're working on the fifth. The first three books are actually a triptych: three separate stories set in three completely different times, but they each explore the relationship between a scientist and their subject.
So Vessel, the first one, is much more from a subject perspective. Inversion, which is the second one, is actually the combination of a scientist and a subject. And then the last one is about [01:10:00] a scientist who is becoming her own subject. I think for us, what we're always focused on is thinking about how individuals experience
science and technology, whether they work in it or whether they are actively participating in its future in one way or another. And I think more recently, as I hinted at, what we've been really focused on is thinking more about the foundations of science, almost even going back to the Middle Ages and earlier, and thinking about what it means to study phenomena, try to bring it together, understand causality.
So that’s a little bit of what exists in our fourth book, which is called Absolute Simultaneity. And yeah, it’s, I think it’s something that we’re going to be building even further upon in our upcoming book.
Did I cover everything?
Generoso: I think you did, beautifully. I'm just a guy that draws the pictures.
Lily: That's not true. [01:11:00] He always says that. We're a really super collaborative team. The way I always like to describe it is, if we had to compare it to movie terms, I'm the screenwriter, he's the cinematographer, but we really work together. And yeah.
joe: Yeah, you can tell. No, it's really good. We have that copy of Inversion here and look forward to reading your other stuff. It's really awesome.
And like I said, there's a lot of sciencey folks who listen, so, you know, definitely go check it out, pick up the books. They're really awesome, and you get that meta. We chatted for a good long while while we were there. Actually, I think, Georgia, you wandered off and I was still there talking.
Then you wandered back and I was still there. So, and it was a such a great conversation. I, and I’m glad you guys came on to the podcast
nick: Yes, thank you so very
Generoso: dude. Thank you for inviting
us.
joe: been fun. Yeah. We gotta, you know, probably have you back and, you know, we’ll follow up on this as we move closer to the robot apocalypse.
You know,
geo: we can
joe: sit back and listen to
geo: in with each other, it’s just
joe: you [01:12:00] know. Yeah. So cool. Anything else? Any last thoughts, Nick? Georgia.
nick: What would your favorite AI movie be? Like, AI in the film, not
joe: I know Lily loves Wall-E that’s it.
geo: Challenge.
Lily: I'm challenging myself with something completely unknown. Oh, this is a movie
Generoso: That's a great question. It's funky, 'cause you can't see this at home, folks, but behind us are about 800 signed movie posters. So right now we're like, AI.
joe: they’re all looking back there what’s in there? I’m looking at my walls too. I’m like, what? I have here oh,
Generoso: That's a really good question. Even going into this conversation, obviously, you know, growing up, because I think Joe and I are around the same age,
joe: Yeah, I think so. We’ve discovered that.
Generoso: Terminator
as a kid was that first thing that was like, oh. And it played off Battlestar Galactica, which is the first time you [01:13:00] start to get this idea that, you know, maybe the machines will replace us. And Westworld,
geo: Oh, yes. Now Westworld, are you talking about the movie or the TV series?
Generoso: Oh no, the original movie, and Futureworld. Yeah, but I think for me, Lily's quite, you're more of the book person, though, in fact. Yeah,
exactly. Like so
nick: you can go book.
joe: Yeah. We’ll take book.
nick: I’m not gonna shut you down for that.
joe: We'll let you think. Go ahead, Nick. What do you got? Because you, you
nick: I was gonna say Tron.
joe: Tron
nick: yeah. I absolutely love those films. Oh. I haven’t seen the last one, and I am gonna preserve myself from that. But
geo: meaning, you’re not gonna do it. You’re not gonna watch it. No. Uhuh
nick: Jared Leto, I'm not a fan. He kinda just drops the ball for me every time. But the original Tron,
joe: yeah,
nick: Peak. Loved it.
geo: I’d say Ex Machina. I love Ex
nick: yes.
joe: Definitely.
Generoso: Oh, wow.
nick: I forgot how much I love that film,
geo: isn’t it? [01:14:00] It’s just, yeah.
nick: Oscar, Isaac. Ugh,
joe: No,
nick: Joe.
joe: we good?
Generoso: You killed Lily. I just want you to know that. No, she's going
Lily: through, I’m just going through
Generoso: like hundreds of different things
in
nick: I’m gonna
go. Are we sure she's not an AI? I'm gonna
joe: with the,
Generoso: Wall-E
joe: got that right. That's it. She's just gonna go with that. I'm done. I'm gonna go a little more sinister. You know, I think we have HAL 9000, 2001: A Space Odyssey. That was probably
nick: Oh yeah.
joe: and I think early horror, like I talk about in the episodes how early I saw some of these movies, but MU/TH/UR from Alien.
Yeah. And that was just there, where you had that controlling, you know, chatbot-other thing. So I'm like, we're done, end of episode. But yeah. But chatbots don't really have an agenda right now, or not that we know of. They're, they are
geo: do
joe: really? Yeah.
They're really there to, they have no agenda when you prompt them and things like that.
But HAL 9000, MU/TH/UR, Skynet [01:15:00] from Terminator, they had agendas. That was their agenda and this was their mission, and they were gonna be unstoppable, and logic didn't factor into that. So I think from that point, they were some of the scariest. And then AUTO from Wall-E, that also had that same
geo: Oh, I thought of another one, but,
joe: All right. Let’s, Lily has something I
Lily: I think I got it. I think, because, you know, one of the reasons why I struggle is that when I think of science fiction and of dystopian writing and cinema, I'm always thinking about concepts that are playing around more with perception and a little bit with time.
So I'm not always in the artificial intelligence realm, though it's artificial intelligence adjacent. But I think a film that involves some AI, as well as this interpersonal relationship and understanding how we communicate and interact as humans, I'd have to go with Solaris, probably.
I think that’s the
Generoso: The original Tarkovsky Solaris. Yeah. Based on the [01:16:00] Lem book.
nick: yeah.
geo: Wow.
joe: yeah. Wow.
And a book, a novel I thought of, was Sea of Rust by C. Robert Cargill. Really? Oh yeah. It's a great novel, and I think there's a sequel to it. But the novel starts out with the last human being killed by the machines. And so the whole book then is about the machines.
Now
geo: it’s the point of view of the machine, of the
joe: and this
geo: and the machine is very human.
joe: Yeah, they're very sentient. They have personalities, and there are hierarchies of the machines. And we're following these scavenger bots.
And they're trying to live, trying to eke out a living. They've replaced the humans in some way. So it's really, you know, they do all
geo: which is just more about the human experience,
joe: But
geo: But not,
joe: you know, machines, and you get that after humans are gone, what are the machines gonna do? They’re built on human philosophies and, you know, so at some point, will they just revert back? We talked about that. Will they just cycle back,
nick: [01:17:00] So the Master Mold Sentinel? Oh
joe: yeah. From X-Men. Yeah. That’s it.
Generoso: Oh yeah.
geo: yeah.
nick: Sorry, I had to bring that one
geo: and then
joe: right? Is that
nick: right.
geo: our son, he's away at college, but he was here for Thanksgiving and he wanted to watch a movie. And the movie he really wanted to watch was Blade Runner. Oh, that's right.
joe: Oh, that’s right. Yeah.
geo: And think of that. That's a good one too.
joe: Yeah. Blade Runner. Yeah.
Very good. Yeah, we could keep going. This was like, you know,
Generoso: Are we still recording, actually?
joe: still are. Yeah. Yeah, we are.
Generoso: Okay. That’s awesome.
geo: there.
joe: Yeah. No,
nick: actually didn’t start
Generoso: and lovely. I think it’s
nick: I didn’t stop we
joe: there's an end. There's a special ending, so yeah, people who listen know. We'll get there. Okay.
Generoso: I was waiting for the special
ending, but I was like, are you going to add that after? Is it
during? I
nick: It,
Generoso: no, we could talk all night. Yeah, I’m fine with that. I’m just like,
geo: But we’ll, yeah.
joe: Yeah. All right. We should probably wrap it there. Thank you. You got me, Joe,
nick: You got Nick?
geo: You
joe: got Nick. We’ve got Nick Georgia. We’ve got
nick: Georgia. And thank you so much again for [01:18:00] being with us.
joe: Hang on.
Lily: Pleasure.
nick: and we went down some ho
joe: robotic holes
nick: don’t forget to call me. Really?
joe: We love you. Stay safe.
nick: Bye-bye.
joe: Stay curious.