
Facial Recognition: Last Week Tonight with John Oliver (HBO)

May 31, 2021
Our main story tonight concerns facial recognition, which ensures that my iPhone won't open unless it sees my face, or the face of any toucan, but that's about it. Facial recognition technologies have been featured in television shows and movies for years; Denzel Washington even found a creative use for one in the 2006 action movie Déjà Vu: "We have facial recognition software?" "Yes." "Let's use it, on the Cross Master bag, all the bags on the south side of the city in the 48 hours before the explosion." "Has that ever been used this way?" "Bingo." "Bingo." In fact, with clever, believable plot development like that, it's frankly no wonder Déjà Vu gets such rave reviews on IMDb, like "an insult to anyone who finished grade school," "worst movie of all time," and my personal favorite, a one-star review saying "Bruce Greenwood, as always, is great and very sexy, and there's a cat that survives," a review which was clearly written by Bruce Greenwood or that cat. Now, the technology behind facial recognition has been around for years, but recently, as it has become more sophisticated, its applications have expanded greatly.
And it's no longer just humans who can be targets: "The iFarm sensor scans each fish and uses automatic image processing to uniquely identify each individual. A number of symptoms are recognized, including the loser fish." Yes, the loser fish, which by the way is an actual industry term, now that the company says it can detect which fish are losers through a facial scan. Which is important, because can you tell which of these fish is a loser and which is a winner? Are you sure about that? Because they're the same fish. That's why you need a computer. But the growth of facial recognition and what it's capable of doing brings with it a host of privacy and civil liberties issues, because if you want to get an idea of how scary this technology could be if it becomes part of everyday life, just watch how a Russian TV presenter shows off an app called FindFace.

"If you meet an attractive girl in a café and you don't have the guts to approach her, no problem. All you need is a smartphone and the FindFace app. Find new friends: take a photo and wait for the result. Now you are already looking at her profile page." Burn it all. Burn it all down. I realize that "burn it all" is a phrase no one involved in that app ever thought of, but imagine it from a woman's perspective: you are going about your day when suddenly you receive a random message from a guy you don't know saying, "Hello."
"I saw you at a café earlier and used the FindFace app to learn your name and contact information. I'll pick you up at your house at 8:00. Don't worry, I know where you live."

But one of the biggest users of facial recognition is, perhaps unsurprisingly, the police. Since 2011, the FBI has logged more than 390,000 facial recognition searches, and the databases law enforcement pulls from include more than 117 million American adults, drawing on, among other things, driver's license photos of residents of all of these states. So about one in two of us have had our photo searched this way, and the police will argue that this is all for the best. In fact, here's a London police official explaining why they use it: "Here in London, we have had the London Bridge attack, the Westminster Bridge attack. The suspects involved in those crimes were often known to the authorities. If they had been in some database, if they had been caught on camera beforehand, we could have been able to prevent those atrocities, and that would definitely be a price worth paying." OK, it is difficult to speak out against atrocity prevention; this program is, and always has been, anti-atrocity. But the key question is, what is the trade-off? If the police could guarantee that they would prevent all robberies, but the only way to do so was by having an officer stationed in every bathroom watching you every time you take a shit, I'm not sure everyone would agree it's worth it, and the people who would might want it for reasons other than crime prevention. And now, of course, is a very good time to be looking at this issue, because there are currently serious concerns that facial recognition is being used to identify Black Lives Matter protesters, and if that's true, it wouldn't be the first time, as Google senior scientist Timnit Gebru will tell you: "There was, for example, the Baltimore police during the Freddie Gray marches, where they used facial recognition to identify protesters and then tried to link them to their social media profiles and then arrest them. So right now a lot of people are urging people not to put images of protesters on social media, because there are people whose job it is to simply find these people and attack them."
That's right: during the Freddie Gray protests, police used facial recognition technology to search for people with outstanding warrants and arrest them, which is a pretty sinister way of undermining the right of assembly. So tonight, let's look at facial recognition.

Let's start with the fact that even as big companies like Microsoft, Amazon, and IBM have been developing it, and governments around the world have been happily implementing it, there haven't been many rules or a set framework for how it is used. In Britain, in fact, they've been experimenting with facial recognition zones, even putting up signs alerting you to the fact that you're about to enter one, which seems polite. But what about when a man decided he didn't actually want to have his face scanned?
"This man didn't want to be caught on police cameras, so he covered his face, but they stopped him and photographed him anyway. An argument ensued." "What's your suspicion about him?" "The fact that he walked past; I'd do the same, it's a cold day." In the end, he walked away with a £90 fine: "There you go, look at that. £90. Well done." Yeah, that Guy Ritchie character was rightly angry about that, and by the way, if you're not British and you're looking at that man, then at me,
wondering how we both come from the same island, let me quickly explain: the British have two settings, so emotionally stunted that they're practically comatose, and gleefully telling large groups of police where to go if they're going to take a photo of their face. There is absolutely nothing in between.

And the UK is not the only one building a system. Australia is investing heavily in a national facial biometric system called "the Capability," which sounds like the name of a Netflix original movie, which is actually perfect if you want people to notice it, think it looks cool, and then forget it ever existed. And you don't have to imagine what this technology would look like in the hands of an authoritarian government, because, unsurprisingly, China is adopting it in a big way: "We can match each face with an ID card and trace all your movements back one week in time. We can match your face with your car, match you with your family members and the people you are in contact with. With enough cameras, we can know who you are frequently with." That's a terrifying level of surveillance. Imagine if the Eye of Sauron, instead of searching Middle-earth for the One Ring, was really interested in where all his orcs like to go to dinner. And some developers funded by the state in China seem strangely oblivious to how sinister their projects sound: Terminator is one of one company founder's favorite movies, so their system shares a name with the AI in it, but they insist theirs is the good version.
"So in Terminator, Skynet is evil, raining death from the sky, but in China, Skynet is good?" "Yeah." "There's no difference?" "Oh, there's a difference, you know." It's not exactly reassuring that you named your huge, all-encompassing AI network Skynet "but it's the good version." It would be like the Today Show building a journalist robot, naming it Matt Lauer, and saying, "Hey, this one is completely different. Sure, it also has a button under its office desk, but all it does is release a lilac air freshener. This is the good version."

The point is, this technology raises troubling philosophical questions about personal freedom, and at the moment there are also some very immediate practical questions, because although it is already being used, this technology is still a work in progress, and its error rate is particularly high when it comes to matching faces in real time. In the United Kingdom, when human rights researchers watched police test one such system, they found that only 8 out of 42 matches were verifiably correct, and that's before you even get into the fact that these systems can have some worrying blind spots, like the ones one MIT researcher discovered by testing numerous algorithms, including Amazon's own Rekognition system: "MIT researcher Joy Buolamwini says that while the overall accuracy rate was high, all the companies were better at detecting and identifying men's faces than women's faces, and the error rate grew as skin tones got darker. The lightest male faces were the easiest to identify, and the darkest female faces were the hardest; one system couldn't even detect that she had a face, and others misidentified her gender." "White guy, no problem." Yeah, "white guy, no problem," which is the unofficial motto of history. But it's not like what we needed right now was for computers to somehow find a way to exacerbate the problem, and it gets worse in one test.
Amazon's system even failed on the face of Oprah Winfrey, someone so recognizable that her magazine only has to print the first letter of her name and your brain automatically fills in the rest. And that's not all: a federal study of more than one hundred facial recognition algorithms found that Asians and African Americans were up to one hundred times more likely to be misidentified than white men. That's clearly concerning, and on top of all this, some law enforcement agencies have been using these systems in ways they weren't exactly designed to be used. In 2017, police were looking for this beer thief: "The surveillance image was not clear enough for facial recognition software to identify him, so police used an image of a lookalike, who happened to be actor Woody Harrelson, which produced the names of several possible suspects and led to an arrest." Yes, they used a photo of Woody Harrelson to catch a beer thief, and how dare you drag Woody Harrelson into this. This is the man who got drunk at Wimbledon in this magnificent hat, who made this facial expression in the stands and by doing so accidentally made tennis interesting for a day. He doesn't deserve prison for this; he deserves the Wimbledon trophy. And there have been multiple cases where investigators have been so confident in a match that they made disastrous mistakes. A few years ago, Sri Lankan authorities mistakenly named this Brown University student as a suspect in a heinous crime, which led to some pretty horrible moments.

"On the morning of April 25th, in the middle of finals season, I woke up in my bedroom to 35 missed calls, all frantically informing me that I had been falsely identified as one of the terrorists involved in the recent Easter attacks in my beloved homeland, Sri Lanka." And that is terrible: finals week is bad enough, what with staying up all night alternating between slamming 5-Hour Energys and chugging Monster Javas while you try to force your brain to remember the differences between baroque and rococo architecture, without waking up to find out you've been accused of terrorism because a computer sucks at faces.
Now, on the one hand, these technical problems could be fixed over time, but even if this technology eventually becomes perfect, we should ask ourselves how comfortable we are with it being used by the police, by governments, by companies, or by anyone. And we should ask ourselves that right now, because we are about to cross an important line. For years, many technology companies approached facial recognition cautiously; in fact, in 2011, Google's then-chairman said it was the one technology the company had held back, because it could be used in very bad ways. Think of it as a Pandora's box, in a Silicon Valley full of the world's most enthusiastic Pandora's box openers. Even some of the large companies that have developed facial recognition algorithms have built them on limited data sets, like mugshots or driver's license photos.

But now something big has changed, thanks to this guy, Hoan Ton-That, and his company, Clearview AI, and I'll let him describe what it does: "Clearview is basically a search engine for faces, so any member of law enforcement can upload a face to the system, and it finds any other publicly available material that matches that particular face." The key phrase there is "publicly available material," because Clearview says it has compiled a database of three billion images, larger than any other facial recognition database in the world, built by pulling them from public social networks such as Facebook, LinkedIn, Twitter, and Instagram. So, for example, Clearview's system could theoretically include this publicly available photo of Ton-That himself at what appears to be Burning Man, or this one of him wearing a suit from Santa Claus's exclusive nighttime collection at Men's Wearhouse, or this very real photo of him shirtless and lighting a cigarette with his hands covered in blood, which by the way is his profile photo, because yes, of course, he is also a musician.
I can only assume that's the cover of an album called Autojump, if that ever shows up on one of your Pandora stations. But Ton-That's willingness to do what others haven't been willing to do, scrape the entire internet for photos, has made his company a real game-changer, in the worst possible way. Watch him impress a reporter by running a sample search: "So here's the photo you uploaded of me." "Mm-hmm." "A photo from CNN." "Mm-hmm." "So the first images found are a few different versions of that same image, but now, as we scroll down, we're starting to see photos of me that are not from that original image. Oh, wow. My goodness. This photo is from my local newspaper, from where I lived in Ireland, and this photo would have been taken when I was, like, 16. Wow, that's crazy."

OK, here's some advice, though: if there is an embarrassing photo of you from when you were a teenager, don't run away from it. Make it the centerpiece of the promotional campaign for your television show and own it. Use the fact that your teenage years were a hormonal Stalingrad. Tap into the pain. But the idea that someone could take your picture and immediately find out everything about you is pretty alarming, even before you discover that more than 600 law enforcement agencies have been using Clearview's service. And you are probably in that database, even if you don't know it: if a photo of you has ever been uploaded to the internet, there is a good chance Clearview has it, even if someone uploaded it without your consent, even if you untagged yourself or later set your account to private. And if you're thinking, "Wait, isn't this against internet companies' terms of service?"
Well, you should know that Clearview actually received cease-and-desist letters from Twitter, YouTube, and Facebook earlier this year but has refused to stop, arguing that it has a First Amendment right to collect social media data, which is not at all how the First Amendment works. You might as well argue that you have an Eighth Amendment right to dress rabbits up like John Lennon; that amendment doesn't cover what you think it does either. And yet Ton-That insists that all of this was inevitable, so frankly, we should all be glad he was the one to do it: "I think the choice now is not between no facial recognition and facial recognition. It's between, you know, bad facial recognition and responsible facial recognition, and we want to be in the responsible category." Sure, you want to be, but are you?

Because there are a lot of red flags here. To begin with, the apps he developed before this included one called Trump Hair, which would simply add Trump's hair to a user's photo, and another called ViddyHo, which phished its own users by tricking them into sharing access to their Gmail accounts and then spammed all their contacts. So I'm not sure I'd want to trust this guy with my privacy, although if I were looking for someone to create an app that would let me put Ron Swanson's mustache on my face while my checking account silently drained, he would surely be at the top of my list. And despite his repeated assurances that his product is intended only for law enforcement agencies, as if that were intrinsically good, he has already put it in the hands of many other people, because in addition to users like the DEA and the FBI, it has also been made available to employees at Kohl's, Walmart, and Macy's, which alone has run more than 6,000 facial searches. And it gets worse, because Clearview also reportedly tried to offer its services to congressional candidate and white supremacist Paul Nehlen, suggesting it could help him use "unconventional databases" for "extreme opposition research," which is a terrifying set of words to share a sentence with "white supremacist." Clearview now says that offer was unauthorized, but when asked who else he might be willing to work with, Ton-That's response was not reassuring: "There are some countries that we would never sell to, that are very adverse to the US, for example, like China and Russia, Iran, North Korea. So those are the things that are definitely off the table." "And what about countries that think being gay should be illegal, where it's a crime?" "So, like I said, we want to make sure we do everything correctly, focusing primarily on the US and Canada, and the interest has been overwhelming, to be honest. There's so much interest that, you know, we're taking it one day at a time." Yeah, that's not very comforting. When you ask a farmer if he would let foxes into the henhouse, the answer you're hoping for is not, "The interest from the foxes has been overwhelming, to be honest. There's so much interest that, you know, we're taking it day by day." And as you might expect, BuzzFeed reporters discovered that Clearview has quietly offered its services to entities in Saudi Arabia and the United Arab Emirates, countries that view human rights laws with roughly the same level of respect that Clearview seems to have for Facebook's terms of service. So facial recognition technology is already here; the question is what we can do about it.
Some are trying to find ways to thwart the cameras: "Hey guys, it's me, Jillian, again with a new makeup tutorial. Today's topic is how to hide from cameras." Well, first of all, that's probably not a scalable solution, and second, I'm not sure whether it makes you the least identifiable person in the world or the most: officers are looking for a young woman, dark hair, medium build, looks like a mime who went through a crusher. Clearly, what we really need to do is put limits on how this technology can be used, and some places have laws.
San Francisco already banned facial recognition last year, but its scope is limited to city law enforcement and does not affect state and federal use, or private companies, while Illinois has a law requiring companies to obtain written permission before collecting a person's fingerprints, face scans, or other biometric identifiers. And that's good, but we also need a comprehensive national policy, and we need it right now, because, again, there is concern that the technology is being used in the protests we are seeing now. The good news is that just this last week, thanks to those protests and two years of work by activists, some companies pulled back from facial recognition: IBM says it will no longer develop the technology, Amazon said it would suspend police use of its system for a year, and Microsoft said it wouldn't sell its technology to police without federal regulation. But there's nothing stopping those companies from changing their minds if public outrage subsides. And for the record, while Clearview says it's canceling its private contracts, it also said it will continue to work with the police, just as it will continue to collect your photos from the internet. So if Clearview is going to keep taking our photos, there may at least be a way to let them know what you think about it. The next time you feel the need to upload a photo, maybe include an extra one just for them to collect. Maybe hold up a sign saying, "These photos were taken accidentally and I would prefer you not look at them," or, if that seems too complicated, just "Fuck Clearview." That really gets the message across. And remember, these photos are often searched by law enforcement, so you may want to take this opportunity to talk to the investigators reviewing your photos, maybe with something like, "I don't look like Woody Harrelson, but while I have your attention, defund the police."
Whatever you feel is most important, tell them. You just have to hold up a sign. That's our show. Thanks so much for watching. See you next week. Good night.
