Does technology give us control or the illusion of it? We explore how societal expectations, the nature of work, and AI challenge what it means to be human, contrasting the allure of self-sufficiency with the call to vulnerability. (46 min)
Links
Transcript
Heavily edited this for reading ease.
Intro
Melody: We came into this saying we both didn’t prepare a list of things to talk about. But maybe it’ll make it more authentic, ‘cause we only met each other not too long ago. Randomly. Through Josh. And then through the internet; I randomly popped up on your Twitter feed. I feel the setup for this is great, ‘cause there’s a lot that I’m super curious to ask you about, and you also don’t know as much about me. There’s a lot to unpack here.00:00
Henry: Oh, yes. Oh, that too. I randomly saw you on my feed.00:18
Henry: Having mutual Twitter followers is really helpful. You know Nicole, you know Jeannie, all these people that I know; that helps it get going.00:36
Melody: Establish trust. And Nicole’s been on the podcast before. On my end, what’s super interesting to me about what you’ve been doing with Hope In Source is that I’ve had a background of exploring this intersection of faith and technology and work. I moved here to New York in August after 10 years in San Francisco, and during that time I cultivated my community around that out there. But I always felt a little bit, I don’t know if ostracized is the right word, but this feeling of not finding people. If you did find someone who was at that intersection, you clung onto them. So we built this little community out in the Bay, and I didn’t really have much exposure to that outside of it. And when I saw that you had this whole podcast, and that this was the intersection you’ve been in the middle of, I found that super interesting. I still don’t know the full story, and even over lunch I was asking you more questions. But, yeah, I feel there’s a lot to talk about at the intersection of faith, technology, AI, building. The fact that you’re doing your own thing and working on your own projects is also super fascinating; the creative process is something I’m super interested in. Or the fact that even meeting you, this community exists in New York, and we still hadn’t met. We didn’t cross paths before.00:45
Henry: That part is weird too. We all have that feeling of, are there other people out there? Yeah, they’re here.02:29
Melody: Yeah. And you hope that the algo can take you there, but sometimes it doesn’t.02:34
Technology and Human Worth
Henry: In the church setting, whether it’s the leadership or people in church, they don’t really get the intersection either. That’s even true for people in tech; they don’t even see it. Especially with open source. And then, in tech, are people thinking about spiritual things? Maybe in a certain direction.02:38
Melody: Yeah, that’s a layer that’s new to me too, the whole open source part. It is curious why people are thinking about this more now, I feel. There’s two things. One, at a macro or political or cultural level, there is a rightward shift, and I don’t want to conflate Christianity with being right-wing, but I do think lots of people are exploring. There is this swing towards religion. Not just Christianity; it’s everything. That’s one element. The other piece is this question of AI. It’s interesting, even talking to… I went home and I was talking to some Korean elders, and the topic of AI came up. And this was straight up a Korean ajusshi, is what I would call him. He was like, “And this is now begging the question of what does it mean to be human?” And I was like, “Whoa.” I didn’t expect to be talking about this over fried chicken. You know what I mean? And I do think AI is also bringing up that question of what is personhood and all of that. And I think that’s where people are searching for answers. And is that faith? Is that… I don’t know.02:51
Henry: It does challenge our default assumptions about what it means to be human, because maybe our society is such that to be human is about what we produce in our market-driven economy. If you’re not increasing GDP, or you’re not able to make stuff, or there’s something out there that’s not a person that can make more, or seems like it can, better and faster, then what are we? You could say it’s good, because then it might lead someone, even myself, to think, oh, my worth and my being shouldn’t necessarily be tied up in that.04:18
Melody: I haven’t heard that one before. That for a long time, in the context of capitalism perhaps, our humanity has been tied to how much we’re able to produce. And now that that’s being taken away or challenged, what does it mean for us to be human?04:59
Work Beyond the Title
Henry: There’s a huge psychological issue culturally around work. Uh, if you’re not working, if you’re unemployed then you’re a loser. Just ‘cause you’re not at a nine to five doesn’t mean you’re not doing anything for yourself, for your community, for God. It’s-05:18
Melody: 100%. I love this topic. It’s something that’s been super top of mind ‘cause I’ve been talking about this with my girlfriends as well, where there is this feeling that a lot of domestic labor has been devalued over time. And you find people who would consider themselves feminists but are struggling with the idea of “I also enjoy sewing and things that would traditionally seem very trad wifey.” And I think there’s this internal battle or challenge. My take on all of that is it’s related to what you said: our view of work is that it’s a job. Work is only when you’re nine to five doing a laptop job. But someone at my church once said this to me. I mentioned this to her; I was like, “Oh, and one day when I stop working…” And she looks at me and goes, “Melody, you never stop working.” And that was profound to me, because it’s true. And that’s why, like you’re saying, the work we do for a community when we serve them, if we’re preparing meals to deliver to a sick friend, if we’re visiting someone in the hospital, if we’re doing work for our own families at home, that’s all work. And the fact that we don’t call it work devalues it, and it takes so much away. We’re tied to our value being when we have a full-time job. And that’s not great for moms, or families, or people who take nontraditional paths. Yeah. It’s not helpful.05:35
Henry: Or even both of us right now. It even comes down to the level of, if you don’t have a name for it that’s easy to understand (is it legibility?), it’s like, oh, it’s not okay, and it’s funny that that would prevent you from doing what you want. If you’re in a good financial place where you could choose to do something different, you still don’t, because of the culture. Yeah. When I first quit, I didn’t get any money. Take a leap of faith; maybe people will give me donations. I had a lot of people messaging me, and the tone they had was always very concerned. It’s “Are you okay?” And that made me second-guess. I’m like, “Am I okay?” And all these things. Now it’s been a long time and no one asks anymore, which is good, I think. Or maybe it’s in the back of their mind. Something that motivates me is the desire to pursue what’s surprising, I think.07:36
Unpredictability and Decision Frameworks
Henry: And I wanna tie that into faith. The American dream: I go to a certain school, get a certain job, get married, have kids, retire, have a house, and then I die. I’m still gonna die in the end. Why do I have to go through that route? And there’s almost a sense of adventure; the mystery of it can be appealing. It’s also very scary.08:29
Melody: And would you say you’re a very risk-tolerant person?08:50
Henry: I don’t know if I would use that word. The point of options is to make a choice. Yeah, that is a risk, ‘cause there’s always opportunity cost if you wanna use that framing. If I keep analyzing in a rationalist type of way, I would never do anything.08:56
Melody: Yeah. The way I think about what you said is also the fact that even if it’s a quote-unquote predictable path you could take, the reality is that those steps aren’t guaranteed either. I think it’s a facade that society tells you: you go to a good school, you get married, you have kids, you get a good job, and actually none of those things are guaranteed either. And maybe what it is is that by taking this quote-unquote riskier path, we’re being honest about how unpredictable life can be and that nothing’s guaranteed. I feel there’s a set of principles that I operate off of, but I haven’t written these down anywhere; I need to get to a point where I can do that. But one of them is this idea that nothing is guaranteed. Not the next hour, not the next day. I know people think I can be a little bit morbid by saying that, but yeah, how do we know we’re gonna be here next month? There’s no guarantee that that’s gonna be the case. And my favorite decision-making framework is this idea of: what if you died in two years? It has to be long-term enough that you’re not eating junk food every day, but also short-term enough that it’ll challenge you to make maybe riskier decisions, or, to me, decisions more aligned with what you want. Yeah, I use that frame to figure out what I wanna do. It convinced me to move; that’s how I made the decision to move out here. Yeah. I don’t know.09:11
Allure of Control
Henry: It’s funny, I can’t help but bring up Ivan Illich, because his life inspires me and also his way of thinking as a Catholic historian.11:02
Melody: I haven’t heard of him.11:12
Henry: He died in 2002. He tries to use a metaphor to talk about our technological society and its ability to control. It creates expectations about what we can do as individuals and as a society. And I think that relates to your point about nothing being guaranteed, but technology makes us feel like it is. You said it; I like your word, facade. The more powerful our technology, the more of a facade we have. He contrasts two words, hope and expectation. Expectation is what technology gives us: that it’s gonna rain tomorrow, what child I’ll have in the future. All these things you can predict. AI is all about prediction. My motivating factor is, how do I live in the present and not think of these fake ideals of an idealistic world, both in the past and the future?11:13
Melody: Yeah. What do you mean by in the past?12:07
Henry: Yeah, when things were good. You can’t go back. We can learn from the past, but both of those ideals lead you to ends-justify-the-means thinking: I will do whatever it takes to get to my ideal scenario. That’s every single bad person, and they all have good intentions.12:09
Melody: I haven’t really thought about it that way, but I like the way that you framed technology. It gives us the idea that we can control things more and, yeah, set more expectations. And you mentioned AI in all of that. How do you feel AI is making us feel that way even more?12:26
Henry: Yeah. At every level. Whenever I’m coding, I’m using it, and then I’m asking it questions about life or personal things and all these things. Where’s the nuanced take? It’s not all good, and it’s not all bad. I’d go back to the psychological effects of it, our dependence on it. I think of it more as outsourcing what we would normally ask someone, our friends or our family, to do; we ask AI instead. Thinking of it more as a therapist and stuff like that. Yeah. Pastoral questions, yeah. An inherent issue with AI is that it’s not embodied, and I think that makes it lack something. And you could say, oh, once we have robots, then that’ll fix it. But I don’t know if that’s even true. We’ll see later, but, yeah, now we’ll give it a body. And I’m saying, do we think there’s a fundamental difference between it and people? I don’t want to just be saying I feel that way, ‘cause there’s something special about us.12:49
Personhood
Melody: But what would it fix, the fact that it’s not embodied? Is that what they’re saying? We talked about it a little bit when we were initially talking about doing the podcast; the Catholic Church had put out that document, I think it was Antiqua et Nova or something. It was the Catholic Church’s stance on AI. In general, I’ve also been very curious about this question, even as AI was coming about, because everyone’s scared of AGI and all this stuff. And I feel there is a theological framework for how to think about personhood. I think the primary thing, the reason why we are created different, is because we’re created in the image of God. And I do think being created in the image of God gives us… I think a big part of that is our rationality. I think it’s also our soul. And AI is never going to have those things.13:35
Henry: How should we think about what it means to be created in the image of God? ‘Cause we say that a lot as Christians, and… I don’t know if we understand it ourselves, but then also, as a non-Christian, what does that even mean?14:52
Melody: I don’t know. It’s funny, I did a deep research query on this. I used AI to write this down. Meta. Let’s see, I don’t know if I’ll remember everything that came out of that, but I do think there’s something about our intellect, this whole rationality thing, and our soul; together, those two things combined are important. Maybe that was how they define intellect or something like that. And when you think about AI, to me, I think of that as knowledge, and I think that’s devoid of the soul. Yes, AI can process knowledge at a greater throughput than any human can. But can it reason from a place that… I think the soul piece is what’s missing, and I think that’s what we… Being created in the image of God is very much the interaction of those things. It’s also what differentiates us from animals. Again, I’m not the most well-versed in the exact philosophical framework here, but that’s it, directionally.15:00
Self-Control
Henry: When we talk about the fruit of the Spirit, one of the last ones is self-control. And I think that is one thing that differentiates us from the animals. The animal lives on instinct: if I’m hungry, I’m gonna eat; I’m gonna hunt for food. But we can suppress those desires, both for good and for bad. I think that has to do with what it means to be human. The example in the podcast I was listening to is that God chooses to rest on the seventh day. That’s, you could say, an example of self-control. He doesn’t have to keep creating. He stops. It wasn’t for a practical reason; He thought it was good. And then Cain kills his brother out of anger, and in that moment he was not able to exercise control and acted, you could say, like an animal.16:18
Melody: An animalistic instinct. Yeah. That’s super interesting to me. What I’ve also heard, and again, I don’t think I’m gonna explain it quite well, is that humans have this ability to detach themselves from their current state and imagine a future state. I feel that adds to the self-control piece, where you can almost imagine what the consequences are two hours from now or something. Yeah.16:59
Self-Perception
Henry: It’s funny, ‘cause you could say that we’re simulating, which is what an LLM does. Another side point would be that I think it’s funny how our conception of who we are always seems to be based on whatever the latest thing we’ve made is. Now people are like, “Oh, we’re an LLM.” We have training, pre-training, all these things. The vocabulary we use to describe ourselves is whatever the latest thing we’ve made in our image. And I think that’s funny. Before, we would use metaphors like wheels or the car or the machine. It’s funny that we keep changing our definition of who we are based on our latest thing. And again, like I said, in our image. I think that says a lot.17:26
Melody: It is really interesting. I’ve heard that, where the tools shape the way that we think. Yeah.18:21
Henry: Marshall McLuhan, which is another person I keep referencing. Oh, I did. Yeah. His book Understanding Media, the subtitle is very helpful: The Extensions of Man. What he means by media is the technology, what we make. That’s not us. And they extend our senses. This microphone can extend my voice, and it extends our brain. His subtitle explains the positives, where it makes us… we can do more, we have more power. But it also can do the opposite. And in his other phrasing, they also amputate us in a way. We lose our own senses.18:28
Melody: That’s interesting, ‘cause this brings us back to this idea of legibility too. In a way, the tools anthropomorphize or make things legible. Yeah, they’re extensions of ourselves, but… And again, maybe calling back to this whole idea of control too. We wanna have a sense of… we wanna make everything tangible so we can understand it. And it’s scary to us to exist in more of a void. If we now have the framework of the LLM to talk about our human feelings, or, I don’t know, scary things that we don’t have control over, we wanna lean on it. It’s a crutch, essentially. I don’t know.19:09
Technological Coping
Henry: Uh, yeah. No, I agree. It feels like technology as a whole is a coping mechanism for not being able to deal with our problems. I feel that way. I don’t wanna feel negative emotions, and I’ll turn to whatever it is that I want to use to cope rather than facing it. And I think it’s funny that we create technology and we lose spiritual technology, if I wanna use that word: prayer and Sabbath and all these things that let us face our demons or our problems, because we’re giving up the things of the world. Yeah.19:58
Melody: I really like that contrast. I think that’s something we need to amplify more. It’s this idea that technology gives us this illusion of control, and we’re continuing to move in that direction. But so much of being a Christian is about the giving up of control. It’s all about humility. It’s all about lifting things up to God. And that is the scariest and hardest thing to do. Prayer is the most obvious example of this. And with AI, how much more are we gonna turn to asking ChatGPT what we should do in this situation with our parents, versus lifting up the issue to God? I personally am already doing that, probably more than I should. I really like these concepts in the context of each other, because I think it signifies so much of where we’re headed as a society. We’re speeding towards more control and, yeah, more anxiety, this idea that we have control over all these things, when I think Christianity, and God, is reminding us that we don’t, and that we are best off leaving it up to Him and trusting in Him.20:38
Henry: And maybe they’re doing it too, your parents. If we don’t have control, then He will reveal Himself eventually. Meaning that whatever we put our trust in, which I’ll say is where we’re putting our hope, that is our god: the thing that ultimately gives us a reason to wake up every day. If it’s not God, it’s gonna fail. And that’s what happens when I realize I’ve been worshiping another god. It could be as simple as, one day it stops working ‘cause it’s down, and I realize I was relying on it too much. That’s the essence of idolatry.21:37
Deliberate Choices
Melody: 100%. Yeah, and this brings to mind, we touched upon this briefly. This is a very nuanced and sensitive topic, but the idea of even IVF, or technology around fertility and pregnancy and that kind of thing. There are times when I start to think, when you’re jumping through too many hoops, when is it too much? Obviously, you want the opportunity for everyone to be able to be parents and all that. But, yeah, I know that’s a very controversial thing to say. I think in general, to your point, it’s: when do we start to idolize some of these things? And when are we using technology to enable us towards things that maybe we have to approach with an open hand?22:56
Henry: No, yeah. I don’t want my default to be a yes to using some technology. And I don’t want to be Amish, but their philosophy is that they vote as a community on what is worth using and what’s not, and that’s the way it works. It’s not that they wholesale remove all technology. Of course, that’s not necessarily how it plays out in practice, but the ideal of that is helpful. When the default is either/or, it means you’re not thinking. You’re not really thinking about the implications of what we’re making. And again, that doesn’t mean that we shouldn’t pursue IVF or whatever it is.23:57
Unexamined Accelerationism
Melody: Yeah. And it’s crazy that it’s being accepted. No one’s talking about it. Literally, the other day at dinner people were talking about getting dogs, and now I’m like, “Isn’t that what Gattaca…” Are we not racing towards this path? ‘Cause, again, it’s the whole “because we can, and technology is a net good.” And I definitely fall into that sometimes, even with the AI stuff. I’m like, “Yeah, it’s inevitable.” I’ve definitely used that argument before. But I’m still wary to say that about artificial wombs, us racing down that path, and cloning non-sentient humans to have replacement organs. Where do you stand on the accelerationist argument for AI? What do you feel is gonna happen? We talked about how, at least for me, I’m not afraid of the sentience piece. I am curious what people are nervous about. I think the picture they paint is that one day the AI will get sentient and try to trick us and take over the world and eradicate humans. Is that what people are afraid of? I don’t know. Suddenly we’re in a simulation, da-da-da. I don’t know. There’s a huge spectrum. That is one aspect of it. Anything’s possible, but I don’t think that matters to me. Meaning that I think other things will happen that could also cause the end of the world, and they would happen before that happens, if that makes sense. Like I said, it’s already changing our dependence on it and changing our relationship to other people. I feel that already leads to a lot of bad outcomes, ‘cause that will happen earlier than our fear that it’s gonna take over the world and then, I don’t know, everything gets bombed or something like that. To me, if that is possible and can happen, then that is a concern. But along the way there are plenty of bad things that will happen. And I know, especially in the EA circles, it’s all about the doomsday scenario: if something is a 1% or .00001% chance but it will destroy everything, then we need to put all of our resources into it. And I feel you can co-opt a lot of people’s careers and lives to get them to think about that, but we don’t even know how to… It’s funny. We don’t even know how to care for our own family, and now we’re concerned about this? There are so many levels before that. Like I said, I don’t even know how to cope with someone getting mad at me, and that’s simple. And now we’re saying, “Oh, the AI’s gonna take over the world.” We don’t even know how to take criticism or forgive people. Yeah.24:37
The Proximity of Care
Melody: It’s almost like the social breakdown is what we should be… the breakdown of the social fabric, the loneliness epidemic. I completely agree, and this is a social thing that we’ve dealt with, where we love the idea of a grand cause to work towards versus being kind to our neighbor. We will get worked up for all these social causes, whether you’re on the left or the right. And we wanna go on missions. I’m a strong proponent of missions. But you need to be serving your church, serving the neighbor next door. Do you even greet the people around you? Do you get up for the grandmother on the subway? If you can’t even extend this level of kindness, why are we even thinking about the greater thing?26:42
Henry: And there is an assumption with that, which is… Yeah, there are concentric circles of care. And not everyone agrees with that. I would say it’s more of a globalist view on things. We even say that everyone’s equal, all these things. I think most people do care about proximity, and it’s not an arbitrary value. But I also think it’s easier to love people that are really far away, because you don’t have to experience the pains of what it means to be in a relationship in a real community. You give them money, which is necessary. But I almost think it makes you… again, you can hide behind that, in a way. I don’t know. It makes you feel good: “Oh, I helped all these people.” And then there’s the person right next to you. There’s this weird cognitive dissonance, I think, that gets created when we care about people in the abstract. Everyone has different levels of, you could say, power or influence to be able to do that. But it’s interesting that most regular people still feel this need to help everybody. If we all helped our neighbor… but it’s, oh, that’s too hard, because we gotta change the culture. It’s funny. Our reason to make all this technology is ‘cause we’re unable to have real community. We scale our lack of character and humanity by scaling technology instead of ourselves. But it’s funny that that seems more impossible to us, ‘cause we don’t have faith in ourselves and in each other, in people. Oh no, that’s why we’re relying on robots or AI to do it for us, because they will do what we want them to do.28:59
Melody: Hmm. Yes. The stuff doesn’t scale, doing the kind things for your neighbors. But what you’re saying is that if we do encourage that as a culture, that is what can scale? Yeah.30:28
Self-Sufficiency and Vulnerability
Melody: We don’t wanna be uncomfortable. But what were you gonna say? Sorry. Ugh. I feel that’s heartbreaking. Yeah. But to your point, I think all of this points to this idea that we don’t wanna be… Even our lunch conversation was about this, I feel. We all wanna be super self-sufficient. We want technology to take care of all of it for us. And this technology gives more people access to this self-sufficiency. I feel that’s what it is, and I feel we’re gonna keep butting up against this. I never really thought about this contrast until this conversation, but technology gives us this facade of self-sufficiency. But really the thing that I think encourages a stronger fabric of society is being willing to be vulnerable, being willing to not be self-sufficient, being willing to deal with uncomfortable feelings where we’re brushing up against each other. We’re sad, we hurt each other. But that’s what we need to lean into more. And technology maybe abstracts all of that away from us, or gives us the idea that we can abstract it all away.30:56
Henry: Yeah, it does. Right. And it works too, to some degree. It’s effective, and it works; I think that’s the problem. When we say it makes us feel self-sufficient, that doesn’t mean we don’t need it. We both work in tech. The point is that you cannot ultimately be self-sufficient, meaning that you can’t rely on it entirely. It’s clearly helpful, and we need it, and I’m working on it. But at the very core of it, that’s not what gives me hope, the idea that we’re gonna solve any of this because we made some new app.32:29
Engineering Out Discomfort
Melody: Yeah. How do you think we ended up here? ‘Cause I feel, as a society, one clear example of this is that we don’t see graveyards anymore. Back in the day, graveyards were near; the church would have a graveyard next to it, and we would encounter death on a more regular basis. It’s almost like we’ve sanitized our society and our lives; we desperately want no sadness or discomfort, and we don’t wanna deal with death. We have these very sanitary lives. And like we’re saying, I think technology is enabling us to live those kinds of extremes: extreme convenience, extreme, I don’t know, happiness only. Do you get what I’m trying to say?33:08
Henry: Yeah, no, I like that a lot, that’s good. Yeah, it hides the fact that there’s suffering going on, I think, on many levels. Illich has a whole book called Medical Nemesis criticizing hospitals and the whole industry. I like his phrase, “From womb to tomb, the doctor has control over your life.” You’re born in the hospital. You die in the hospital. Now some of us don’t even see our parents or our grandparents when they’re dying, which is crazy. And again, hospitals have done a lot of good, but he’s pointing out that clearly there are some issues around that. The graveyard thing is interesting. When someone dies, why do they have to pay $50,000 for the funeral and everything? That’s insane too. Why is there even a job, a professionalized industry, to do that? And you could say the same thing about life. And I think it’s hard to escape, because, yeah, it’s the default. To not do that would be weird: having a baby in the hospital, or dying in the hospital. It’s funny. Not just for your career but for your life.33:59
Melody: To follow the path, like we were saying. These are all parallels. It’s all about control and this guise of expectation that modern society has written out for us. ‘Cause a career and job path wasn’t a thing 200 years ago, a laptop job? We expect more things with fewer downsides.35:17
Hidden Labor
Henry: Many things are very recent, which is interesting. Not being able to understand history… We have a lot of things we expect now. Yeah. Sometimes technology or society hides bad things. When you live in a city, you use the subway, you see the trash all over the street. Like you said, the graveyard was next to the church; everyone saw it. But now we hide everything. We pay someone else and all our trash is gone. You don’t see it. And even with AI, yes, the interface we see is ChatGPT, but it was trained on all the data on the internet. And they also had to pay, or barely pay, I think, lots of people to label data. Yes or no, right or wrong. That’s all manual labor. Facebook? They have people who got PTSD from figuring out whether these things are good or bad. And it’s, oh, it’s all AI, it’s automated. But it was on the backs of all these people who have to deal with that. And that’s everywhere.35:43
Melody: Yeah. We’ll look back at that and it’s the cruelty of… It’s a different version of the inhumanity that coal miners had to go through in order to produce energy. In the same way, these data labelers were subjected to horrendous conditions to give us a clean, sanitized internet, or sanitized data that we can use for our AI.36:48
Henry: Yeah. You were mentioning what it means to be a stay-at-home parent. I really like his concept of… he calls it shadow work: work that is unpaid but required in order to consume some good that you buy. One example of this is the self-checkout line. When you buy something at a store, that was someone’s job, and now we’re doing it. You could say it’s free labor, for the convenience of being able to get out. Or the easiest one is commuting: you don’t get paid for commuting, but it’s a part of your job. There’s a lot of real work that we all have to do in order to function in society. Usually an answer to that is, “Oh, we should pay people for it.” It’s funny that the answer is always… The other one is childcare. He doesn’t think it works, because the amount of money that’s in the shadow economy is more than we have. You cannot measure how much money that is. You’re always gonna pay them too little. So.37:11
Melody: Yeah, and that… This question has been floating through my mind since the last topic we were talking about: how did we get here as a society? I don’t know. I don’t know if it’s capitalism. Is it that everything’s broken down into some economic value and we’ve sliced and diced every single service? Childcare, hospitals, school. I don’t know. And like we’re saying, it breaks down the social fabric. When really, there’s an intangible value to serving each other in these ways that goes beyond any economic value that you can boil it down to. Yeah.38:17
Tech as an Act of Love
Henry: And just because we cannot explain it doesn’t mean it’s not there. I think it’s easy to say, “Well, you can’t measure it.” I think that’s a framing already: having to be able to measure it or convert it into money. The value of something based on what you exchange for it is the monetary value. But there’s straight up the utility of it. There is a lot of worth in cooking for yourself, and not because it might be cheaper. How do you measure the love you have for your children? Or your friends? And I really like the idea that technology can be an act of love too. I love this essay called An App Can Be a Home-Cooked Meal. The writer, Robin Sloan, he’s not a programmer, but he wrote an app for his family. It’s a social media app just for them. He never has to update it. There’s no login. They’ve used it every day for the last four years, and he wrote it in a weekend. Not that no one’s ever done that before, but it’s helpful to think, “Oh, I don’t have to make a Facebook.”39:23
Melody: I love that. To your point, I feel we lose something by trying to boil everything down to some, quote, unquote, “economic value.” I think this points to this whole idea of, again, our culture’s obsession with control. Once you can put a number to it, we feel we have some control: we can measure it, we can improve it. What’s interesting, then, in this era of AI and vibe coding, if you wanna call it that, is that technology can now maybe be divorced from only the big players who have enough money and the right incentive structures to build technology tools. Now more people have the tools to do what you mentioned. We can use technology as a home-cooked meal.40:26
Henry: There’s always been this dream of what people have called end-user programming, which is that the users themselves can essentially code. You don’t need to know how to program, and it’s true that LLMs do help with that. Yeah. Sometimes it feels like it’s easier to point out the bad, and I think that if we are in the industry, then we can point out examples of how you can use it well.41:16
Melody: Yeah, it’s interesting. With any technology, it’s enabling lots of good, and it can enable lots of bad. Yeah. It sounds like, at a super high level, it’s: what are the principles? What’s the culture we wanna encourage around the use of all of this? How do we use it to amplify the good, how we can serve each other better using technology, rather than having it completely replace all of our interactions with our community and that kind of thing?41:31
Finitude
Henry: We were talking about how technology is legible; it makes things easier to measure. Another concept I like from Illich is this idea of what is valuable versus what is good. Value is measurable, but that’s what makes it subjective, which is ironic: you would think the thing you can measure is the objective thing. But the good is objective, because it involves limitation. It involves acknowledging that too much of something can be bad, and too little of something can be bad. There’s the golden mean scenario. I think that relates to what it means to be created as a creature. We are not God. We should acknowledge who we are as limited, finite beings, and we can embrace that. And that is good. There’s a book called Small Is Beautiful which refers to this too. Yeah. I don’t know. The most interesting or meaningful parts of my life have always been things that are non-measurable. The desire to measure it is an issue too. When we talk about friendship, stuff like that, we all know that. Measuring friendship by how many likes you have, all these kinds of things, we know there’s something inherently wrong with that. If it feels like vibes, then we’re like, “Oh, that’s not… I’m being too emotional.” We don’t think that’s right. It’s weird. We don’t trust our intuition a lot. And, I don’t know, maybe we feel smarter because we have a rationalist way of thinking about things. Cool. Awesome. Thanks.42:26
Melody: Yeah. And it’s being honest about our state. Yeah. Somehow we’re all heading towards vibes everything: vibe coding, vibe design, vibes… Maybe we’re realizing that it isn’t… Yeah. No, thanks, Henry. I feel I have so much to think about from here. I think my main takeaway from this was this idea that technology gives you more control, and that so much of a functional, healthy society is not that, not focusing on optimizing for control in everything. Yeah.43:19
Henry: And maybe it comes… That’s ‘cause of where we’re coming from: we know that we don’t have control, because we believe in God. But if you don’t believe that, what else is there left? You should be pursuing control, which is pursuing power. Jesus chooses to become powerless on the cross. But then, in some way, the ironic part is that in his powerlessness he shows his real power. You know, the resurrection and everything. Yeah.44:51
Melody: What I like about that is that it amplifies this contrast again. Maybe what it is is: where does our angsty desire for control come from? It is an anxiety. It is a “do not worry.” We’re worrying, and that worry comes out and manifests itself in our idea of control. Which is what we were saying at the very beginning: we think this life, this set path, is in our control, but it’s all a facade of the world, all an illusion. And really the only thing we can hope in is Christ.45:24