
SPECIAL GUEST: Patrick Quinlan (Sexbot). He wrote the book on it. Yes, the author of Sexbot stopped by to chat about the future of the ultimate sex toy. One of the more popular, and controversial, visions of the future, the possibility of companion robots has tantalized, titillated and taunted. Sexbot is a thriller set in the near future, a world of corporate malfeasance, assassinations, breakthrough science, and major changes in society. Is this the world we’ll be living in soon? Join us and find out. Recorded 8/25/2015.

 

You can download the episode here, or an even higher quality copy here...

 

Mike & Matt's Recommended Reading:

Patrick's Official Site

Patrick on Wikipedia

Patrick on Twitter

Patrick Quinlan Thinks You Are Going to Fall in Love With Your Sex Robot, by Brian Whitney (DisInformation, 6/19/2015)

‘Sexbot’ Author Patrick Quinlan on CIA Sexbots, Mind Downloads and Immortality, on Future Of Sex (8/26/2014)

More coming soon...

 

Transcript:

Alpha: Welcome to another episode of Robot Overlordz, episode #201. On the show we take a look at how society is changing, everything from pop culture reviews to political commentary, technology trends to social norms, all in about thirty minutes or less, every Tuesday and Thursday.

Mike Johnston: Greetings cyborgs, robots, and natural humans. I’m Mike Johnston.

Matt Bolton: And I’m Matt Bolton.

MJ: And joining us on this episode is Patrick Quinlan. Patrick, thanks for joining us.

Patrick Quinlan: Thanks for having me, guys.

MJ: So, our standard first question is kind of, Patrick, could you tell us, for our audience that maybe isn’t familiar with you, a little bit about your background and kind of what you do?

PQ: Okay, I guess I’m primarily known as a writer of crime thriller novels. Over the years, since the mid 2000s, I’ve written a bunch of novels, but they mostly had nothing to do with the future. In 2007, I was a co-author of a book called “All Those Moments,” which is the autobiography of Rutger Hauer, who’s a semi-famous actor who played the character Roy Batty in the movie “Blade Runner.”

MB: I’ve actually been a big Rutger Hauer fan for a long time, so.

PQ: When I was a kid, I was a huge Rutger Hauer fan, off the charts. And then my agent called me one day, in 2005 I want to say, and said, “Hey, do you know who Rutger Hauer is?” And I was like, “Do I know who Rutger Hauer is? Rutger Hauer was my favorite actor when I was a kid.” He’s like, “Well, I want you to write a book with him, and he’s in a hotel room in Manhattan, and here’s his number, give him a call.” So I called him and we chatted, and then we decided to do the book together. We had to write a nonfiction book proposal, and then we did his book, which took a couple of years. More to the point, the thing that’s probably the reason I’m on tonight is that about a year ago I wrote a novel called “Sexbot,” which is about a robotics scientist who invents the world’s most advanced sex toy, which they call the “Sexbot,” and then later she and her partner stumble upon the secret to human immortality, which is to download our awareness into intelligent machines. So, the company they work for decides to kill them for this technology that they’ve developed, and just before she gets killed she downloads her awareness into the best machine that she has available, which is the prototype for the 9th generation of Sexbots. So, she dies, and at the same time she becomes “Number Nine,” the Sexbot prototype. Then, you know, it’s like a crime thriller: she’s out to solve and avenge her own murder.

MJ: So, what’s the reaction to the book been like so far?

PQ: Oh, the reaction has been great. I’ve done a ton of media, sold a bunch, and not too bad. You know, pretty good; I’m happy with the reaction.

MJ: Okay. So, I guess, what is it about the concept of sexbots that interests you? I mean, Matt and I have talked about it quite a bit with a number of our guests, starting with Miss Metaverse all the way to several other people, particularly around when “Ex Machina” came out just this year here in the States. What is it about the concept that people find fascinating? And what is it about that concept that interested you enough to write about it?

PQ: Well, you know, it was not a concept that actually independently occurred to me. I’m actually one of these people who is almost entirely checked out of popular culture. I don’t own a television set, and it’d be almost hard to describe or fathom how incredibly checked out of popular culture I am. I live in Florida half the year, and my girlfriend has got this phobia of flying, so a lot of times we take the train from New York to Florida, and we get one of these rooms in the train. Riding the train from New York to Florida is like—if you don’t have a room, it’s like…I can’t even really begin to describe what it’s like. But if you do have a room, it’s really nice, and then you go to dinner, etc. One night we found ourselves having dinner in the dining car with a Manhattan real estate developer, very, very wealthy. The thing about riding the train is you immediately hit it off with everybody because everybody is there for the same reason, which is they’re afraid to fly. This is a fact, at least in the dining car, at least for the people who have rooms, within five minutes, you’re like, “So, why are you taking the train?” and invariably it’s because they can’t fly, “I don’t want to be on an airplane,” or, “I had a terrible experience on an airplane” or, “I know too much about airplanes,” what have you. And so we’re chatting with this guy, he’s a very, very wealthy guy, late ‘50s. Eventually he comes to a point and he says—he’s had a lot of women trouble in his life, and he’s been divorced a couple of times, etc.—and he says, “You know, very soon there’s going to be robots and they’re going to be so incredibly lifelike that you won’t be able to tell them apart from real women.” And I was like, “Yeah, really?” And he’s like, “Oh, yeah. Yeah, for sure. And when that happens, I’m going to buy three of them.”

MB: [laughs]

PQ: And so, you know, blah, blah, blah, we’re talking, and then he gets to a point and he tells me that he’s spent two million dollars preparing—basically his point was that very soon in the future there’s going to be human immortality and it was going to be available to very wealthy people, which he is one. And basically what it’s going to be is that you can download your awareness into a machine and then you’ll live forever. He spent two million dollars of his money invested in being—what do you call it when they freeze you?

MJ: Oh, cryonics.

PQ: Yeah, cryonics. He’s contracted to be frozen because he doesn’t think he’s going to be around long enough for the immortality to come about. So, he’s going to be frozen, and then when human immortality becomes possible, he’s going to have them wake him up, cure whatever was wrong with him that he died of in the first place, and then he’s going to be downloaded into a machine. And I said to him, “Well, you know, that sounds really nice except how can you trust these people that froze you to wake you up?” And he said, “Well, that’s where the two million dollars comes in. I have stashed money, commodities, investments all over the world, and when I get frozen there’s going to be a note, and the note says, ‘If you wake me up, I’ll tell you where these things are and then you can have them.’” So, this is the conversation I had with this guy. And, you know, I know nothing about this at all, and then I start looking at it with the idea—the first thing that occurred to me was that this sounds like a book. And obviously this sounds like a book because lots of people have written that book, and now I have too. So I started looking at this stuff and right away I came across RealDoll, which I’m sure you guys know what this is, right?

MJ: Mhm.

PQ: And then I came across the Geminoid, this Japanese scientist, Hiroshi Ishiguro. Are you familiar with this guy?

MJ: Yep.

PQ: Yep, you guys are way ahead of me. And then also the 2045 Initiative, Dmitry Itskov, all this stuff that’s going on. I’m like, “Well, you know, one of these days, RealDoll, they’re going to start thinking about putting artificial intelligence and robotics in their dolls.” These things are incredibly lifelike even though they’re, you know, 100 pounds or 70 pounds of meat that you have to move and put them where you want. Once they put in some artificial intelligence and robotics, then these things are going to be incredibly convincing, right?

MJ: Yeah.

PQ: So that’s where the story of the book came from, was basically putting these disparate elements that I had frankly never heard of before, or never really thought about. And then it occurred to me—you guys are familiar with ELIZA?

MJ: The chat bot?

PQ: Yeah, right. And it recently occurred to me that when this comes about—this guy Weizenbaum, who invented it, used to find his grad students in the lab talking to it at night—it’ll be…let’s put it this way: I have a 20-year-old Saab convertible, I talk to it and it talks to me, not that anybody could hear it. I’m in love with this car. And once these robots are talking back to people, people will easily be convinced. And as I’m sure you know, Weizenbaum spent the rest of his life campaigning against artificial intelligence for that reason, that people are so easily duped. These machines would put one over on them whether or not the machines are really intelligent; it wouldn’t matter.

MJ: I think that brings up kind of an interesting problem. I think you’ve alluded to it in some of the articles that you’ve got online about how that’s going to change human relationships. So, is that one of the themes of your book, “Sexbot”?

PQ: How is it going to change human relationships? No, I wouldn’t—well, on some level it’s more…”Sexbot” takes place from the point of view, for the most part, of the woman inside the sex machine, less so—no, frankly you’re right, it is. There’s a couple of people during the course of the book who fall in love with the Sexbot. There’s a guy who has been injured in Afghanistan, he’s a former military guy and his face is horribly disfigured, and he has a Sexbot that he’s in love with because he’s convinced that this is the only woman that will have him. You know, you go on these RealDoll chat rooms or what have you, there’s bulletin boards where people who own these things sort of write in and talk about ownership of these things. And the thing, it can’t communicate, it doesn’t think, it doesn’t say anything, and yet these guys are in love with these things, and invariably within five minutes of going on one of these sites, you’ll see somebody who says, “This is way better than a real woman.”

MB: Because they’re quiet?

PQ: Well, yeah, because they’re quiet, because you don’t have to sit through dinner—right, because they don’t say anything before uh…

MB: [laughs]

PQ: I had a professor in college who used to—I don’t think he would get away with this anymore—who used to say, “The only thing I want to hear from a woman before 12 noon is, ‘How do you like your eggs?’”

MB: [laughs]

MJ: [laughs]

PQ: You know, these things really appeal probably—I don’t know if it’s that crowd that they appeal to—but they certainly appeal to people who have trouble connecting with other humans or with human women; I think RealDoll sells the vast majority of their dolls to men. So, you know, when these things come about, sure, they’ll be very convincing, and yeah, that is at least partially a theme in the book. Although the book is more about her trying to escape from the evil corporation and avenge her murder and try to have some semblance of a life now that she’s actually dead.

MB: How far into the future do you think it’s going to be before we start seeing—I know there’s the real life sex dolls—but before we start seeing them kind of hit the mainstream?

PQ: I think it could be a few years from now. I just read some time over the winter that RealDoll is investing in artificial intelligence and robotics. The thing to be clear about is I don’t know when, if ever, these things will become self-aware, what have you. The big problem is, “When the machines become self-aware…that’ll be the end of us.” That’s the big trick. But I think that very soon, five years, three years, they’ll start to become really, really convincing, because part of what you’re doing when you interact with one of these things is you’re imagining. When I interact with my convertible, I’m just imagining that it likes me. It doesn’t really think about anything, you know? And that’s part of your relationship with your cat, too. You imagine all these sort of—what this cat is thinking about you, and the cat is primarily thinking of you as a source of food.

MB: [laughs] Do you think that people would actually want a…I mean, if we’re talking just strictly a sex robot, do you think people would even want one that was self-aware, or would they just want one that can, you know…? I would think I would just want something that can hold a simple conversation, you know?

PQ: I don’t know if people want that, but people get what they’re given. I don’t think people knew that they wanted Google before it came. I think, right, people want to think that it’s real; I think they probably do. I think they do want it to be self-aware. But at the same time, they want it to be completely, completely empathetic and completely in the thrall of the owner, and they want it to be self-aware but they want that awareness to be their #1 cheerleader at the same time. I think that’s a very human tendency: we want people to…if it’s sort of like a dumb robot but it loves us and thinks we’re great, then that’s not quite what we want. We want it to be really smart AND love us and think we’re great.

MJ: One of the criticisms I’ve heard is that it lowers the bar on the quality of relationships, and that for people who do have difficulty interacting—for example, the guy who has problems interacting with real women, or the women who have problems interacting with real men—it lowers the bar such that, well, now they don’t have to try; now they’re just somewhat more isolated because they have this sort of surrogate that maybe can get to the level of a devoted dog, but that’s as far as it might go with some of the weaker artificial intelligence, maybe.

PQ: Sure. Does it lower the bar for those folks, meaning that they don’t have to…? Is that what you were going to say?

MJ: Basically. I mean, just that it would create almost more of a problem than not having it at all, I think that’s the criticism. I don’t necessarily agree one way or the other. I mean, I just think it’s kind of an interesting area to think about as far as the impact that might have. We had on the guys from—I don’t know if you’re familiar with the web series “Gigahoes,” but it’s sort of a web series comedy about a cyber brothel using sex robots. And the one character there, he has problems interacting with people, and yet he has no problems interacting with the robots. And it’s meant to be a comedy series, and I thought it was pretty funny. I don’t know what Matt thought.

MB: Yeah, no, it was well done.

PQ: I think that for folks that have trouble interacting with other people, I think that it could be actually a wonderful thing. I don’t think people get a whole lot better. If your thing is, “I have trouble interacting with people,” I don’t think, like if you read books and take a class, I don’t think you’re going to get a whole lot better at interacting with people. You might get 5% better over time and things are less horrible than they were. I haven’t met too many people that sort of undergo radical—most people I’ve known, I know you guys were buddies in school, and I have a few people I’ve known that long. They’re just the same as they were when they were 12. Good, bad, or indifferent, people don’t change all that much. If you’re really having trouble interacting with other people, I think that’s going to be a pretty defining point of your life and I don’t know that it’s going to change all that much. So, you know, these robots could be a wonderful companion for folks that have that problem. You mentioned a dog. Right, they could be just like—dogs are pretty accepting companions. Although, I had an attack dog when I was a kid that used to kill everything, but that’s another story. They can be very sort of accepting companions, but you can’t have sex with them, you know? You can’t take it to that level. Some people do, but they probably shouldn’t because the dog can’t give them consent, as far as we know. But yeah, they could be wonderful companions for folks that have trouble connecting with other people. I don’t see a problem with that. I don’t feel like we need to be stern with people and be like, “Well, you know, you have to try harder.” It could just be that’s how they’re going to be.

MJ: Yeah. And I guess I’m thinking, like, I don’t know if you’ve seen Matt Groening’s “Futurama,” where one of the characters gets a sex robot and his friends sit him down because he’s spending too much time with it, and they show him the educational film and it’s done in sort of the style of the ‘50s educational films, where basically Billy and his Marilyn Monroe sexbot just make out, and basically the economy breaks down, the world goes to shit and all this stuff because all Billy does anymore is sit in his room and make out with his Monrobot. And then they show him as an old man and he’s making out with the Monrobot, and he basically says, “It was a good life” and he dies. And that’s all he does for his entire life, and this is meant to be this educational film. I think that’s kind of the criticism that some people have made of this concept, is that if you can get everything you want out of a relationship by not having a relationship or having something that’s just like fast-forwarding to all the best parts in your favorite movie and not having to watch the movie, somewhat…That relationships are about somewhat the challenge of interacting with another human being; if you strip that out, something is lost. I’m kind of two minds about that. On the one hand, yeah, something might be lost. But on the other, like you said, for people that aren’t getting that, that could be amazing for those people.

PQ: I think that’s right. You know, Mike, right, the challenges of relationships can be really incredibly challenging sometimes, and you just kind of…It could be a thing, you know, you're in between relationships, I don’t know, maybe you rent one of these things for six months and then you have a nice six-month fling, travel the world with a sex robot. I don’t know. And then go back and try again with a human. Sure, part of the richness of human relationships is the challenge, and, as you say, some people, that’s a very—sort of a challenge that they’re not going to meet. Here’s the thing: the internet opened up this whole thing and then all these people are looking at porn all the time, right? It used to be, when I was a kid, you either found porn magazines in the trash or you had to like go into a store and buy one, and no one would ever do that because it’s embarrassing. And then all of a sudden everyone can watch as much porn as they ever could imagine.

MB: [laughs]

PQ: And more—way more—because no one would ever think of these things that go on, right? And that happened and the world didn’t end as a result of it. And I guess some people lock themselves away and watch porn the rest of their lives, but most people have to go out and earn a living, and they will. I don’t think there’s a tremendous danger of people just sort of isolating themselves from other human beings to that extent.

MB: You did an interview where you were talking about people using sexbots for people with rape fantasies and a child porn fetish and stuff like that. Could you talk a little bit about that?

PQ: Someone asked me a question about that and I had never thought of it. Sure, you could see there are people who have these problems and they cannot be cured of them. So, wouldn’t it be better—and obviously you wouldn’t try to develop the artificial intelligence in those kinds of bots to the point where the thing was actually experiencing this. Years ago, I used to work—this was a long time ago—at a place that was a large child welfare organization; it served the city of New York. And there were all these girls in this facility, they were…for the most part, most of the kids in this facility were sort of violent criminals. And they were as young as 12 years old, and they had been raised in families where the parents were dead or they were on crack or they were in prison, or what have you; essentially these kids had been raised by wolves. By the time they were 12, 13, 14, they were violent, violent kids with a lot of problems. Most of the girls were pregnant and had kids by the time they were in their early teens, and they used to give them—their children would be taken away from them and they used to give them these dolls, and they had to carry these dolls around when they went to school or whatever they were doing all day. They carried these dolls around, and this is 20 years ago, so the dolls were sort of rudimentarily programmed to cry, to be hungry, to wet themselves or what have you. The challenge for the girls was to not kill the doll, and the doll was programmed to calculate the amount of physical abuse; because basically what the girls were supposed to do was take care of the doll, and if it was crying, then to nurture it, and if it wet itself, to change it, this sort of thing. And the doll was programmed to keep crying no matter what they did sometimes, and what they tended to do was shake the shit out of this doll until it was dead.

MB: [laughs]

MJ: [laughs]

PQ: So that was the challenge, so it was better for those girls to practice on something that if they killed it, it didn’t matter, before they got their own babies back, right? Because obviously these girls are at high risk, if they get their children back, to hurt them, because they’ve been hurt themselves and they have very low tolerance for frustration. You know, so you could see how if you had a very lifelike doll and you had someone who’s a child molester or a rapist, it’s better for them, because they’re going to do it anyway, for the most part. All the criminology will say these people don’t get better. I shouldn’t say—obviously that’s a blanket assessment and it’s not true. Some get better, a lot don’t, and they can determine pretty well who’s going to get better and who’s gotten better and who’s lying. Better to give them a doll to work out on than a person; better to give them a robot. And also, now can you imagine all the data that you could compile on their behavior? “Are they going to offend again?” Especially if it was all uploaded to a computer, if you had a thousand of these out there, you’d compile a tremendous amount of data about the behavior of these folks. Which, you know, becomes “Big Brother” and you could put them away before they ever commit another crime because you know they’re thinking about it.

MJ: So, Patrick, to kind of take it back to your book for a minute, you mentioned the corporation a little bit. So, is the theme a little bit—is one of the themes, rather—a little bit the way that corporations have somewhat insinuated themselves into our lives, whether it’s big data or just products and things like that? I mean, it sounds like some of the murder mystery at least revolves around that corporation.

PQ: The corporation is absolutely the main antagonist in the story. Basically, yeah, what happens is this woman, she and her partner come across this idea that the mind is not in the brain but it’s in a sort of field around the body, which is obviously something you guys have heard before, and they find a way to more or less cut and paste it into—they’re working with chimpanzees. And when it works, they want to stop the project because they want to, you know, get it peer-reviewed and they think the United Nations should be brought in. And this is incredible technology, the company, all it wants to do is sell the technology and it wants to make a fortune off of it. So yes, very much so, the company is the major antagonist, and particularly the CEO of the company is the major antagonist, and he decides that “These two must be killed because we’re going to move forward to human trials before…” Basically, “We want to move forward and we’re not going to let the inventors of this slow anything down.” Basically the book is really sort of about how the company has its own security force and the company acts as a sort of lawless and autonomous entity outside of the structures of government control. When the police come, the company goons keep the police out of certain areas; the company has its own hospital, etc. It’s sort of an autonomous entity, the company, in this story.

MJ: That’s an area that kind of interests me a lot as far as how society is changing, that we have this huge disparity between the .00001% and everybody else. You mentioned the immortality for the rich, somewhat. Do you see maybe that society is set up to go to kind of a two-tier, nobility who are effectively like the Greek gods and they’re immortal, and then everybody else, who’s just kind of stuck in the muck, so to speak?

PQ: I’ve thought about this a lot, and I think that there’s a lot of possible ways this could go. I mean, frankly global warming could put a sudden and abrupt end to this whole thing literally overnight. I suppose that’s always there. They talk about the singularity and how artificial intelligence will become self-aware and then we’ll all be supplanted by robots, and that may happen. It seems sort of so distant and weird that I have a hard time thinking about that. But advanced robotics are coming, and they’re right around the corner, and they’re supplanting workers like crazy, and we know that the power elite who run the show—workers are sort of a messy question for them. Who’s that incredible war criminal who used to be secretary of state? Kissinger. He used to call the working class “useless breeders.” They will get rid of us if they can. If we get to a point where everything can be done by robots—all the food is being grown by robots, all the stuff is being made by robots—what do they need us for? It’s an open-ended question. And the other thing is, you know, have you ever seen—I’m sure you’ve seen this, of course you have—it’s owned by Google now I think, or maybe Facebook; somebody owns Boston Dynamics.

MJ: It’s Google, yeah. I was a big fan—well, not necessarily a big fan of Boston Dynamics—but their PantherBot is just scary as crap, I think. So, I was following them when Google bought them.

PQ: Right, it’s a big dog, right, that was their first thing. My girlfriend was in Cambridge with her sister, and they’re walking, and they cut through a parking lot in an office park somewhere. This was, like, 6-7 years ago. She came back and she said, “I saw something out of a nightmare today. I can’t even believe it’s real.” I said, “What is it?” She said, “We were walking through a parking lot and they had this robot, it looks like an animal, and it was running back and forth through the parking lot and there were a couple of guys. I don’t think I’m ever going to sleep again.” And so we googled this thing and we got an early video of it on YouTube, and you know, they’re so much more advanced now than they were then. Can you imagine when they put machine guns on these things and there’s, like, a hundred of them coming and they’re loaded down with .50 caliber machine guns? That’d be tough to beat. Suppose Google were to become the—I suppose they will—the One World Government and they set these things loose on people? I think we’d win anyway, to be honest with you. I think people are clever enough to beat these things. Even if they’re not the superior intelligence, people are so creative that they’ll come up with ways to beat these things. But initially it’d be pretty horrifying to have to be faced with these things.

MJ: Yeah. My nightmare scenario was kind of if they were to deploy something like that in a protest or riot situation like, say, Ferguson, to have those things chasing down protesters. Actually, I thought more of the protests against things like the G8 and things like that. You take some of those grievances that the population has that might be a little more on the politically respectable side, I guess.

PQ: Sure, sure. I could see that. I imagine it will go to that. That wouldn’t surprise me in the least if they had those things. Years ago when W. Bush was elected, I went down to his inauguration parade with a friend of mine, we took the bus. There were like 50 buses in a line going down to Washington, D.C. to protest the inauguration parade. And they parked at RFK Stadium, in the parking lot there, which had just been retired as a football stadium, and then they proceeded to sort of march everybody into these little zones where, you know, the protest zone was nowhere near the parade. The cops were there. My friend and I slipped the noose and we made it downtown to the parade, but a lot of people got put in these little pens, where I guess they were supposed to shout and wave signs at no one. I could see using these big dogs to sort of herd people into these sort of “free speech zones,” and, you know, if the protests got violent or what have you, then to set ‘em loose on folks. I could see that. That doesn’t even sound farfetched; that’s not even really that terrible, actually. You know, I could see it sort of worse than that. I think eventually they’ll set ‘em loose on battlegrounds and see how…If we had them when the Iraq war started, I think they probably would’ve set ‘em loose. I’m surprised they haven’t introduced them in the Afghanistan fighting.

MJ: Yeah. Well, and I think both parties actually have done those free speech zones, and not just at the national level, I think at the local level, even.

PQ: Sure. It’s hard because we live in a society where there is this incredible gulf between the people in charge and everybody else. I guess that’s the world. But yeah, you’re right. So, you know, if we get to a point that the robotics can make everything that the elite need to live off of and they can meet their sexual needs through sex robots and all the rest, then they really have no use for the working class. We have no use for them either, but we don’t have our hands on that kind of robotic technology.

MJ: Yeah. Is there anything else that you want to make sure we cover, Patrick?

PQ: No, I mean, just buy my book, “Sexbot.” Very entertaining, successful, suspenseful, and it’s got a sex robot in it, so how can you lose?

MJ: And where can people find you online if they’re curious to know more about you or the book or your other books?

PQ: They could go to www.PatrickQuinlan.com, or I have a blog which I update once in a rare while, www.TheOptimist.com.

MJ: Okay. Well, Patrick, thanks so much for joining us tonight.

PQ: Thanks Mike, thanks Matt. Appreciate it.

MB: Thanks.

A: That’s all for this episode of Robot Overlordz. Are you interested in the future and how society is changing? We’d love to have you join our community. Visit our website to learn more and to connect with others that share that interest. You can find us at RobotOverlordz.FM. The site includes all of the show’s old episodes along with complete transcripts, links to more information about the topics and guests in each episode, and our mailing list and forums. We’d also love to hear what you think about the show. You can review us on iTunes or email us.

MJ: I’m [email address protected from spambots].

MB: And I’m [email address protected from spambots].

A: We hope to see you again in the future…

MJ: Thanks everyone for listening.

MB: Thanks.

 

Image Credit: Sandro Botticelli [Public domain], via Wikimedia Commons