TAKE IT DOWN Act: Fighting Digital Abuse Against Non-consensual Imaging
Dalton Anderson (00:01.016)
Welcome to the Venture Step Podcast, where we discuss entrepreneurship, industry trends, and the occasional book review. Today we're going to be discussing a topic that I've been wanting to touch on for a while, but the issue was there wasn't anything out there in progress to prevent these kinds of grotesque acts. And now the TAKE IT DOWN Act has passed the Senate and the House.
We're on track. So I was sent a video, well, two YouTube videos to be precise, about how people are making, and telling others to make, these AI models. And when I say AI models, I don't mean an AI model like ChatGPT; I'm talking about a deep fake version of somebody, or a completely synthetic
virtual human, to create OnlyFans sales funnels. And the breakdown of what they described doing was very bad. And I didn't want to talk about it because there was nothing preventing people from doing these things; there wasn't an act in place to explicitly target
deep fakes, but now there is. But basically what they described doing was to find cute women on TikTok, take their face and stamp it onto an Instagram-model-type body, create a profile, use the dances of this body with this person's face, and then create an
OnlyFans sales funnel to get people to watch the video on TikTok and then go subscribe to their OnlyFans to make money. And it's just such a gross thing to do, not cool, and just disgusting. It's just disgusting human behavior. And I didn't want to encourage it or talk about it in
Dalton Anderson (02:25.112)
pretty much any detail. I described it a couple of times on some episodes, but I never talked about the methodology of what this person described doing. And even then, I'm not really going into that much detail. The guy breaks down which apps to use, which websites to go to, how to identify a good model versus a bad model, all sorts of stuff. He really breaks it down, and it's just...
I think with innovative technology, you'll always attract people who are either weirdos or who
want to exploit others using this technology for notoriety or monetary gain, or to attack others. And that's evident with some of the attacks on Taylor Swift; social media platforms had to block searches for her name for some time while they got the deep fake videos under control.
Basically, people made deep fakes of Taylor Swift doing sexual acts, and
it was just out of control. And the only way that social media companies were able to combat this was to just not allow people to search that name while they took down all the content.
Dalton Anderson (04:01.934)
So I'm going to get into the actual purpose of the episode. I just wanted to give you some background on why I couldn't talk about what I wanted to talk about previously: there was no TAKE IT DOWN Act. And yeah, I'm overall very, very bullish about this act. I think it future-proofs quite a few things and covers revenge porn and deep fake porn and anything of that sort.
I think it puts us in a pretty good spot from a legal standpoint to scale with the technology that is rapidly evolving and rapidly advancing, as you can tell from these episodes and the ongoing development of everything that you see and hear online. Like, for example, my altered deep fake voice that I created.
I showed this to folks and showed them the episode, and I didn't tell them that it was AI, and you didn't know it was AI when you first listened to the episode. People couldn't tell. They couldn't tell that it was AI until I told them. And then once I let them know, hey, this is AI, that's when the whole world opened up: oh, I knew it, I knew it, there was this little thing and it makes sense now. But if you don't tell,
if you don't tell the person, then it's difficult to know. And to the normal ear, you weren't able to listen for the different cues; you weren't able to tell that that wasn't me, which is scary. Like, what if someone took all my voice data? Please don't do this, by the way. God, we're pals. But what if somebody took all my voice data
and started saying all sorts of bad things on the internet, like X, Y, Z, just horrific stuff? And then I lose my job, or I lose my children. I don't have any children at the moment, but I will in the future. I gotta get a girlfriend first and then a wife, but yeah, I've got plans. I'm not there yet.
Dalton Anderson (06:29.614)
All that being said, society judges folks on their actions, the things that they've done, and the things that they did prior. And there are certain things that you just won't be able to get away from. Doing bad things typically has ramifications. And if I'm saying bad things on the internet, people will be like, oh, well, I don't want to associate myself with Dalton because he's a bad person; look at all the stuff he says on the internet.
But what if that stuff is fake and not me? And you can't tell that it's not me, like if somebody puts up a video and lip-syncs it with my voice.
It's just really difficult. It's really difficult to know: is it me or not me? And that's kind of the gist of what the Take It Down Act is about. The Take It Down Act is about revenge porn and sexual acts related to deep fakes. And this has since been defined under NCII, which basically defines what non-consensual intimate imagery is.
So if you're in an intimate relationship with somebody and you decide to film or take photos of each other, you've consented to that within your relationship. And obviously, if the other person doesn't consent, you can't take video of people... but let me rephrase that; I phrased it in an odd way.
Basically, if you're in an intimate relationship and you agree to take videos and photos of each other, that's fine. But just because you gave consent for private photos or videos to be taken, recorded, and then stored, that doesn't mean you consented to publication of those videos. And as you've seen in the past, people have been victims of
Dalton Anderson (08:42.776)
publication of private information or private sexual acts. And most of the time, the victims of these acts are women. And then there's this other emerging risk of deep fakes with the advancement of AI, where people are able to go on websites that allow you to undress somebody, or
take a photo that was not intimate, so you don't have to have an intimate relationship with that person, and then undress that person and turn it into an intimate image, a nude, or a sexual act that that person, one, is not aware of and, two, did not consent to. And the photos that you can use as reference points to create these sexual acts or images
aren't of a sexual nature. You could be at dinner, or just taking a photo of yourself walking around the mall, whatever it may be. That is the emerging risk that has developed over the last two years or so. There have been some high-profile victims, especially high schoolers, that I'll get into later in the episode.
So basically, NCII rules did not include deep fakes. And so now there's this Take It Down Act, after these high-profile events, especially with women, or I guess I should say girls in high school, where this has occurred and social media companies took a long time to take down these digitally altered images.
In the case of one victim, it took Snapchat nine months to take down the sexual image. And that resulted in all sorts of issues for the victim in high school, and just distractions. I mean, it's just not a good situation to be in when you're scared that there are fake nudes circulating around your high school, or around the whole internet, of
Dalton Anderson (11:08.896)
you or your friend or anyone else.
So, the rise of deep fakes. You may be thinking, oh, I've never heard about a deep fake. What is a deep fake? Who even does that kind of stuff? That's not even popular; I haven't even heard about that. Okay, fair enough, you are not around weirdos, so pat yourself on the back. But these deep fake websites do get a lot of visits. I have in here that the top 16 sites in six months got
200 million visitors. These are estimates from a legal case that is ongoing; these numbers were the result of an investigation. So the top 16 sites had 200 million people going to them in six months.
So it's a little bit more mainstream than you would think. I don't know these sites. I've never been on a site like this. I know that they exist, and I know that on certain internet forums, when people are talking about these new AI models, there are weirdos in the thread asking, can you do sexual stuff with it? And we're like, what are you talking about? We're talking about the architecture of the model and how they trained it, what their training recipe was. And then there's somebody in the corner like,
oh yeah, but can you make AI chicks, or is there an AI girlfriend I can make? I would love an AI girlfriend. And everyone else is like, get out of here, you weirdo, go somewhere else. And so I knew these things existed. I just didn't know, one, how popular they were, and two, I didn't know
Dalton Anderson (12:57.23)
that this many people went to them. It's crazy. One thing I did find concerning is a figure I read: as of last year, there were more than 21,000 deep fake videos uploaded to pornographic sites, a 460% increase from the year before. So, year over year, a 460% increase in
deep fake videos uploaded online.
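As a rough sanity check on that figure, and assuming "a 460% increase" means the count grew by 4.6 times the prior year's total, the implied prior-year count works out to roughly:

\[
\text{prior-year count} \approx \frac{21{,}000}{1 + 4.6} = \frac{21{,}000}{5.6} \approx 3{,}750 \text{ videos}
\]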
I don't think I defined what a deep fake is. A deep fake could be anything; I would say any digitally altered asset could be considered a deep fake. So when I made my AI voice, that would be considered a deep fake, because I...
Dalton Anderson (13:52.814)
I did not say those things, nor was that me, but it sounded like me, acted like me, had my same tone of voice and cadence of speech. And people had a hard time telling the difference because it was a deep fake. But I made the deep fake, so I was aware of it and I consented to it; so by that definition, it wouldn't really count. But if someone, unknown to me, made
a deep fake of me saying inappropriate things on the internet, that would be a deep fake. But the issue is, it's a new technology, and there aren't a lot of synthetic keys built into the audio or the images or the videos. One company that does this very well is Google, where they bake a synthetic ID into the pixel structure of the image or video that was generated. Because pixels
are basically numbers that go from 0 to 255, and you can encode information in those pixels without noticeably changing or altering the quality of the image. And so if you create an image with Google, Google knows, when they scan the internet, that it's an AI photo. These keys aren't created at the moment for a whole bunch of sites, so with these deep fake sites you won't be able to tell where an image came from. It's kind of like an unregistered gun:
Google has a serial number on their guns; these other websites, not so much. But yeah, that would be a deep fake. It would be a digitally altered asset made without the person's consent. And so you could make sexual images, you can create and distribute impersonations of that person, you can create videos of them doing stuff that
is inappropriate, all sorts of things that would attack or undermine that person's credibility, or have them unknowingly make decisions without their knowledge. I mean, this would be straight-up fraud, but Fidelity has voice authorization as their authenticator. So if you're able to authenticate by saying your... I think all they ask is your name.
Dalton Anderson (16:21.61)
And then after that, it allows you access to the systems. It says thank you for authenticating, and boom, you're in. So I think they'll have to rethink quite a few of those authentication methodologies. But anyway, that was deep fakes.
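To make that pixel-level "synthetic key" idea a bit more concrete, here's a minimal sketch of hiding an identifier in an image's least-significant bits. This is only a toy illustration of the general concept, not how Google's actual watermarking works (their method isn't public at this level of detail), and the function names are made up for the example.

```python
import numpy as np

def embed_key(pixels: np.ndarray, key_bits: str) -> np.ndarray:
    """Hide a bit string in the least-significant bits of the first pixels.

    pixels: uint8 array of shape (H, W, 3), values 0-255.
    key_bits: e.g. "1011001110001111"
    """
    out = pixels.copy()
    flat = out.reshape(-1)                     # view over every channel value
    for i, bit in enumerate(key_bits):
        flat[i] = (flat[i] & 0xFE) | int(bit)  # overwrite only the lowest bit
    return out

def read_key(pixels: np.ndarray, n_bits: int) -> str:
    """Read the first n_bits least-significant bits back out."""
    flat = pixels.reshape(-1)
    return "".join(str(flat[i] & 1) for i in range(n_bits))

# Toy usage: a 64x64 RGB "image" carries a 16-bit key with no visible change.
img = np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)
key = "1011001110001111"
tagged = embed_key(img, key)
assert read_key(tagged, len(key)) == key
```

A real scheme like Google's spreads the signal across the whole image so it survives compression, resizing, and cropping; this fragile least-significant-bit version is just to show where a "serial number" can physically live inside the pixel values.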
Dalton Anderson (16:43.736)
The next thing is...
Getting confused here.
Dalton Anderson (16:52.312)
The next thing is defining what the Take It Down Act comprises. So I talked about NCII, and NCII defines revenge porn, or what consent means in intimate relationships as it relates to videography or images. And then the next thing is
Dalton Anderson (17:15.278)
revenge porn, or publishing without consent: just because you had an intimate relationship where you both consented to images or videography being taken while you were in that relationship, that did not consent to publication. And so if that were to happen and there was no explicit thing in writing saying, hey, I consent to publication, then that would be defined as revenge porn,
whereas deep fakes don't involve any prior relationship or consent at all. As I mentioned before, you can take an image of a non-sexual nature and make it into an image that depicts a sexual act.
Dalton Anderson (18:07.938)
And so the Take It Down Act has a different approach.
The approach is aimed at the distributors of these models that allow you to, quote unquote, undress people and make these deep fakes. And then there's also an agreement with social media companies like Meta Platforms and Snapchat to commit to,
and have an obligation for, removal of content after a good-faith statement is made by the person filing the complaint that this is a deep fake. They have to take it down immediately, or within 48 hours; 48 hours is the deadline, and if not, they'll be held liable for it. And then in addition to that, there are also provisions
targeting the distributors or the creators of these AI models. Something similar to this act was attempted before and failed to pass the House; it was called the Defiance Act. The Defiance Act was similar to the Take It Down Act, but it was more about allowing the victim to have
restitution regarding deep fakes and revenge porn at a federal level, whereas the Take It Down Act focuses more on the deep fakes and the NCII definitions of consent to...
Dalton Anderson (19:54.488)
videography and photography in an intimate relationship, and defining what those limits are, like the fact that consent does not cover publication unless it's explicit and in writing. If
Dalton Anderson (20:10.9)
the Defiance Act had passed, it would have covered most of this, but it just didn't pass the House. And that was led, I think, by
Alexandria Ocasio-Cortez. The Take It Down Act is led by Ted Cruz, mostly because one of the key victims in this whole ordeal was Elliston Berry, and she was based in Texas. She was a freshman in high school, and one of her classmates unfortunately decided that it'd be funny to create a deep fake of her using an image that she had posted on Instagram.
And the image of her on Instagram that was used as the source or reference image, they showed it in the video, not the sexual image, of course, but the normal, original image, was of a dress she was wearing. I think she was going to dinner with her parents and staying somewhere near the beach or something like that. And they took that image, "undressed" her, and then distributed it throughout the school.
And obviously that's a brutal thing to have happen, especially when it takes nine months to get it taken down. And I would like to show a video of her later on, because I think she has really...
Dalton Anderson (21:39.544)
Let me find the right word for it. She's taking the right approach on this. Instead of, and she could handle it however she wants, but instead of being embarrassed and upset, she is taking a different approach where she's attacking the problem and speaking out, and she
is speaking about the bill and was one of the key advocates for it. It wasn't her parents, it was her. The stuff that she talks about, her approach, and how well-spoken she is, I think, is a great representation of her and her personality and her character. So I wanted to display a video of her speaking about the subject, but I'll do that later in the episode.
And then there was another person named Francesca Mani, who was based out of New Jersey. The same thing happened to her. She's also been a key representative of this issue and an advocate; she was on 60 Minutes with her mom.
And so those two girls who were affected were both in high school, and both had similar issues where they had difficulty taking down the images: they were up for too long, it's not fair, we need to figure this out, and something's got to change. And so that's where this came from.
Dalton Anderson (23:25.346)
The next thing is what happens if you violate the NCII law that now covers both revenge porn and the synthetic
publication of pornographic images or videos.
Dalton Anderson (23:49.742)
I think that these rules are a little bit...
I think that they're a little bit more relaxed than I would expect them to be, but I
think what they're really trying to get after is attacking the source instead of the people. But I don't know, I just feel like they're a little lax. The penalties: fines, forfeiture and restitution, and imprisonment up to two years for adults, and then three years if the victim is a minor.
So like two to three years.
I don't know.
Dalton Anderson (24:36.398)
I don't know, I just don't feel like that's enough. I mean, you should know not to do these things. You should know, especially if it's an adult doing that to a minor; that's crazy. If a minor did it to a minor, I can understand that, you know, it might have just been a badly placed joke or some kind of bullying going on,
and I think there should be some kind of flexibility there. But yeah, I mean, if the victim is a minor and you're an adult, the penalties should be very hefty for any of those things. And maybe this doesn't include the other charges, like you would end up, you know, on the public record for
all sorts of things, and you can't live near schools, and all sorts of other issues. But yeah, I was surprised about the two to three years; it just didn't seem like that was enough. But anyway. Okay, so something I touched on a little bit earlier was the key provisions in the Take It Down Act, and the
biggest one is that within 48 hours, action needs to be taken and the content needs to get taken down. And the only thing the victim has to provide is a good-faith statement. They don't have to provide a whole package; they don't have to fill out some kind of form. All they have to do is provide a good-faith statement like, hey, this is me, I did not consent to this, this is a deep fake, or this is, you know, an intimate image that I
did not provide an explicit statement in writing to publish. I mean, you could say it differently; I would be freaking out if this happened to me. I would be like, this isn't me, whatever, take it down. I'd probably just be freaking out, I don't know.
Dalton Anderson (26:50.178)
But the essence of it is that if a platform doesn't take it down in time, then there will be fines and investigations and sanctions. The body that oversees the Take It Down Act is the FTC. And if the FTC determines that you are acting in non-compliance, so if it
isn't taken down in 48 hours, you're considered noncompliant and the FTC will launch an investigation. And when the FTC launches an investigation, most of the time, I would say like 98% of the time, they'll take you to trial, they'll win, and you've got to do whatever they tell you to. So you don't want that if you're a social media company, because the FTC does not play around.
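As a rough illustration of that 48-hour obligation, and only as a sketch with made-up field names, not how any platform actually implements it, a compliance check might look something like this:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

TAKEDOWN_WINDOW = timedelta(hours=48)  # removal deadline described above

@dataclass
class TakedownRequest:
    reported_at: datetime            # when the victim's good-faith statement came in
    removed_at: Optional[datetime]   # when the content actually came down, if it did

    def deadline(self) -> datetime:
        return self.reported_at + TAKEDOWN_WINDOW

    def is_compliant(self, now: datetime) -> bool:
        """True if removed before the deadline; overdue content risks FTC action."""
        if self.removed_at is not None:
            return self.removed_at <= self.deadline()
        return now <= self.deadline()  # still inside the window, not yet overdue

# Toy usage: reported Monday at noon, taken down Tuesday evening -> inside 48 hours.
req = TakedownRequest(reported_at=datetime(2025, 5, 5, 12, 0),
                      removed_at=datetime(2025, 5, 6, 18, 0))
print(req.is_compliant(now=datetime(2025, 5, 7, 12, 0)))  # True
```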
Dalton Anderson (27:53.23)
Overall, it did have sponsorship from Meta and Snapchat. I'm pretty sure they've already built systems and have things in place to report images, but I don't think they have an option like, okay, this is a deep fake, this is revenge porn, I'm going to take it down right away. And I think the issue with social media at the moment,
regarding these extreme things, is they do a kind of crowd report, where if many people report that an image is bad or not so good, then they'll take it down. But if it's related to something extreme like fake nudes, and there are not that many people reporting it, then it just doesn't get taken down. I've helped women that I know,
or girls that I grew up with, where people would create these fake accounts under their name, take all the photos from their Instagram, and publish them on the fake account, and they would start getting followers. They would have thousands of followers at some point, and they'd just pop up, and then people are like, hey, is this you? I didn't know you made a new account. And they're like, I didn't make a new account. Who is this? And we all had to
basically go in and she'd have to tell all her followers and DM people like, Hey, like there's this account that's fake. It's not me. Instagram is refusing to take it down. Like, can you help me report this? And of course I would report it every time. Like that's, it's an easy thing to do.
But the fact that you have to take it to that level and reach out to other folks to also report the image before they take it down is ridiculous. If somebody makes a fake account of you and they're starting to mimic you online, get followers, and then start publishing sexual stuff, Instagram should just take it down. Meta Platforms should just take it down.
Dalton Anderson (30:07.842)
You should give people the benefit of the doubt, and the verification process should be pretty easy: one account has the original date of creation and is way older, and the other one is brand new, or maybe it recently had a name change and deleted all its photos and re-uploaded all these different photos, and the photos match. I mean,
it's not that hard of a problem to verify which one is new and which one is old and verify these people's claims. And so I just never understood why Instagram, or Meta, or Facebook took so long to take down these things. And the same thing applies to Snapchat, where people would create these Snapchat accounts of women I was friends with, I mean, they're women now, but back then they were girls,
and then they would add people who were related to them, like their friend group or people in high school, and then they would start messaging people and sending nudes and all sorts of stuff.
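That verification idea, comparing when each account was created and how much of the photo set overlaps, is simple enough to sketch. This is a hypothetical check with made-up fields, not Instagram's or Snapchat's actual process:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Account:
    username: str
    created_at: datetime
    photo_hashes: set[str]  # fingerprints of the photos the account has posted

def likely_impersonation(claimant: Account, suspect: Account,
                         overlap_threshold: float = 0.5) -> bool:
    """Flag the suspect account if it is newer than the claimant's account
    and most of its photos are copies of the claimant's photos."""
    if not suspect.photo_hashes:
        return False
    overlap = len(suspect.photo_hashes & claimant.photo_hashes) / len(suspect.photo_hashes)
    return suspect.created_at > claimant.created_at and overlap >= overlap_threshold

# Toy usage: the fake account appeared years later and reposts 3 of its 3 photos.
real = Account("dalton", datetime(2015, 3, 1), {"a1", "b2", "c3", "d4"})
fake = Account("dalton_.1", datetime(2024, 11, 20), {"a1", "b2", "c3"})
print(likely_impersonation(real, fake))  # True
```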
Dalton Anderson (31:19.322)
And you had to get your whole friend group and a whole bunch of folks involved to report it and get the account banned. And if you didn't have that kind of voice, then you would be shit out of luck. Snapchat and Meta, they don't care, and they'll just let it fly, instead of taking people at their word when they say, hey, this is
not right, somebody is mimicking me online and now they're doing sexually inappropriate stuff. That's not me, take it down. That's it. I don't know, there are just so many examples. It's like, what do you need, ten people to tell you that your house is burning down?
Oh, not enough people told me, so I didn't know my house was burning down. Okay, well, you know, we were at seven, but we really needed ten. We needed ten people to tell me my house is burning down. Like, why? Why would somebody just randomly go up to you like, hey man, your house is burning down, you've got to figure it out?
In a general sense, society in itself is wholesome. There are some bad actors, and that's promoted by the media; a lot of what the media talks about is very negative in nature, and frantic, and invokes fear, because that's what makes money. But in a general sense, if you just walk around the world, wherever you go, a lot of people are quite nice, even
strangers in various countries that have less money than you, or places where you're known, or you're just walking around. I mean, people aren't out to get you; people may just feel that way because of the things they watch on television.
Dalton Anderson (33:37.016)
What I talked about a little earlier was the face swap proposition. This is a weird gray area, and the Take It Down Act addresses it. Basically, if I take someone else's face and plaster and stitch it onto somebody else's body, then technically it isn't anybody; it's like a fake person. But
this is when the definition from the Take It Down Act is put into perspective, and it is: would a reasonable person be able to tell that this is a different person? And if the answer is no, then it qualifies for takedown under the Take It Down Act.
So if a reasonable person is not able to recognize that this is a deep fake, or to recognize whether the image relates to person A versus person B...
Dalton Anderson (34:53.992)
That is what the definition comes down to: the composite image or video is artificially generated, and you compare person A to person B. Can you relate those two people? If you can, then it's a deep fake. So I think that would cover quite a bit, because even if it's not an actual image of the person, like this undressing thing,
and it's a digitally altered version of their face, even though it's not exactly their face, it's close to their face, if a reasonable person can't tell the difference between the two people, then it's a deep fake. Which I think, that kind of clause, protects against future advancements as this grotesque technology evolves.
Dalton Anderson (35:53.422)
And the practicality of this is going to be a huge task; there is just going to be a monster undertaking of going after these websites that are generating all of this
"undress me" content or deep fake pornography. I mean, as I mentioned earlier, year over year it was up 460%, with over 20,000 deep fake videos uploaded to pornographic sites. And
it's going to be hard. It's going to be hard to identify what is a deep fake to go after, especially on a normal porn site or something like that; I think it'd be difficult to determine that something is a deep fake. The TikTok example that I mentioned earlier in the episode, during the intro, makes a little bit more sense, simply because this person is a normal person and
they are doing these sexual dances and creating a sales funnel to OnlyFans. And so it's easy to link those three things together: the original person, the deep fake person doing the dances that this person has never done, and then that AI model linking to the sales funnel for OnlyFans. Those connections are easy to make.
But if it's a deep fake just out of nowhere, just uploaded to Pornhub or something like that, then...
Dalton Anderson (37:43.31)
I don't know how you would track that down unless people have built these apps that scrub the internet. And that might become a thing, where people create apps that scrub the internet for deep fakes of you, what you're doing, whether or not you're okay with it, and how to take it down. And maybe there are whole consulting firms that get created just to help manage your digital
presence in this deep fake world, and they'll take things down, or bribe people to remove stuff, or whatever it may be. But I think those are just additional work around the problem versus attacking the actual problem: taking down these big companies and putting laws together, which is what we're doing at the moment, laws that make the generation of deep fakes illegal, and then also putting laws together that...
Because you won't be able to prevent people from creating these deep fakes; it's going to be impossible to completely prevent it. But what I could see happening is this:
as I mentioned earlier, each gun has a serial number on it, and if you're caught with a gun with no serial number, you'll get in a lot of trouble.
Dalton Anderson (39:08.846)
So if you put a serial number on these AI images and videos that you create, this digital synthetic key, something that Google does within the pixels, where the key is generated as part of creating the photo at the pixel level, then you will be able to tell what is AI-generated and what is real. If you made a law that forced every AI model company that
provides imagery or video, or anything AI-related that outputs imagery or video, because you can also alter an existing image or video, to assign a synthetic key to it, and that synthetic key is linked to the content that gets distributed on the internet, and if you have the majority of the major
websites registered with a synthetic key, then it's easier to tell which websites are not registered. And so this is quite complicated, because you would need an alliance with the private companies, and then you'd probably need some kind of world alliance to get a registration of all the main AI companies to create the synthetic key upon the generation of an image or video.
Dalton Anderson (40:41.006)
If that is doable, I think that's the best result, because then it's easy to track down which companies are not generating synthetic keys. If you have the majority of companies, say compliance at 80%, then instead of looking around and trying to survey the whole market and who can generate these things, now it's a simpler problem where you're only looking at the remaining 20% of the hundred.
So I think it narrows down the issue. At the moment, they're attacking the top 16 websites.
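Here's a rough sketch of that registry idea: a shared list of generators that embed synthetic keys, so anything carrying an unknown key, or no key at all, stands out. Every name and prefix here is hypothetical; no such registry exists today:

```python
from typing import Optional

# Hypothetical registry mapping synthetic-key prefixes to the AI providers
# that registered them, analogous to serial numbers on firearms.
REGISTERED_PREFIXES = {
    "ggl-": "Google",
    "oai-": "OpenAI",
    "sta-": "Stability AI",
}

def classify(synthetic_key: Optional[str]) -> str:
    """Classify an image based on the synthetic key extracted from it, if any."""
    if synthetic_key is None:
        return "no key: either a real photo or an unregistered generator"
    for prefix, provider in REGISTERED_PREFIXES.items():
        if synthetic_key.startswith(prefix):
            return f"AI-generated, registered provider: {provider}"
    return "AI-generated, unregistered provider: flag for review"

print(classify("ggl-7f3a9c"))   # registered: Google
print(classify("xxx-000001"))   # unregistered: flag for review
print(classify(None))           # no key found
```

The point of the 80% compliance framing shows up here: once most major generators are registered, anything unrecognized is automatically part of the small remainder worth investigating.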
Dalton Anderson (41:21.264)
And that's an ongoing development.
But as I mentioned, these AI models are just going to become more sophisticated and more capable over time. And it seems like every six months or so, or every three months, there's always some kind of historic breakthrough where it's like, wow, this is crazy, this is next level. And then three months later, there's another breakthrough that just blows your mind. And that happens over and over and over again, over a period of
three or so years, and before you know it, you don't even remember where you were standing prior. And that's what's really been going on lately. And so I don't think that things get less advanced; I mean, they get more advanced. So I'm really happy with what has been passed, and I'm a hundred percent sure that this is going to get signed. I think that there were only two people who voted against it, and I think it was something like 400 and...
I should know the count for sure, but sometimes I get a little nervy on the podcast. It was 409 to 2. So two people voted against it, and 409 people voted for it.
Dalton Anderson (42:42.658)
So it has strong bipartisan support, of course. I mean, I don't know why you would vote against it.
But overall, I'm very happy with the application of this and how it is moving forward. And as I close out, I would like to share a video of one of the victims, because I think she represents herself in a wonderful way. So let me share my screen.
Dalton Anderson (43:20.962)
Alright, let's do this. So full screen.
Dalton Anderson (43:28.76)
Is this it?
This isn't it. Hold on. I think it was... it was playing videos while I was... oh my gosh. It was playing videos probably while I was talking.
Dalton Anderson (43:54.103)
Okay, here we go.
Dalton Anderson (43:59.022)
The video is loading up.
Dalton Anderson (48:02.798)
So that was the video that I wanted to share. I thought that it represented her in a wonderful light: taking something that happened to her that was very unfortunate and advocating for change, creating a lot of pressure on politicians to create mechanisms that don't allow this to happen to other people, which
has been successful. Same thing with Francesca. Overall, a very bad thing happened to both of those girls, and it's very unfortunate, but they have spoken out, were very strong, and have since
impacted the legislation that has now passed. It is not yet signed by the president, but it has passed the House and Senate, and I don't see why he would not sign it. So I just wanted to end on that last note, that it's refreshing that
Dalton Anderson (49:23.042)
these things are changing. The concern that women and girls have had, since even when I was a kid, like this has been an issue since I was a kid, when I was growing up, this was an issue, and obviously the issue hasn't gotten any better, it's just gotten worse over time. I'm happy that these things are changing, and I'm happy about the
impact that these two girls have had on shaping and pressuring politicians to create a bill that is passable, has bipartisan support, and got through ASAP, because this happened about 15 months ago. I don't know when things were
published publicly, but I am very glad that things are moving forward in a positive direction.
That being said, I probably should have played that video at the beginning of the episode. Maybe you didn't make it to the end, but yeah, I just got distracted. But of course, wherever you are in this world, have a great day, good afternoon, good evening. And I hope to hear from you... I hope you hear from me next week, sorry. But anyways, goodbye.