The title of this article seems to imply that face painting was intentionally used among Juggalos to deter computerized facial recognition - of course we know that's not the case. Though to be fair, somewhere in the history of the practice, it may be argued that it was to throw off human face-recognition.
This reminds me of a sort of creative look into futuristic fashion I saw a bit ago:
https://cvdazzle.com
Where the fashion of the future is made to be expressive while preventing mass surveillance.
The world is filled with people that figured out things accidentally because they were doing something else.
For example, the person who figured out how to heat food with microwaves (and created the first microwave) didn't set out to do so; he discovered that the chocolate bar in his front pocket melted while working on magnetrons.
"Figuring something out" doesn't imply concerted research towards that goal; accidental understanding happens all the time, to all of us, on many topics.
> For example, the person who figured out how to heat food with microwaves (and created the first microwave) didn't set out to do so; he discovered that the chocolate bar in his front pocket melted while working on magnetrons.
Willis Carrier was hired to build a machine to keep the humidity in a room at a magazine from making the inks bleed. He built the machine, which pulled moisture from the air by condensing it onto cooled coils, only to discover that the room dropped significantly in temperature.
He ran with it, and eventually founded the Carrier air conditioning company.
RF heating was understood well before the development of the cavity magnetron; Westinghouse demonstrated RF cooking at the 1933 World's Fair using a standard shortwave transmitter.
The chocolate bar story is attributed to Percy Spencer in 1945 and is probably somewhat fanciful - anyone working around high-power radar equipment would necessarily be aware of RF heating, because it presents a significant safety hazard. Spencer can legitimately claim to be the inventor of the microwave oven and is named on Raytheon's patent.
I've had that happen before. I was gifted a Lindt chocolate bar for my birthday, put it in the back pocket of my loose-ish (not jeans) pants, and by the time I got home 30 minutes later it was completely liquefied. It wasn't very warm out either; I remember it being ~10C.
One of the reasons why "discovered" is more accurate than "invented" most of the time. It was always there in nature, someone was fortunate enough to stumble upon it and pioneer the discovery.
>>> used among Juggalos to deter computerized facial recognition - of course we know that's not the case.
Yes and no. Such levels of makeup are meant to be a mask. They are meant to at least partially obscure a person's true features or emotions. Facial recognition is having a very difficult time with women's faces under heavy makeup. Styles with straight lines or geometric shapes (right angles) really throw it off. I'd expect that no facial recognition system could yet survive Comic-Con.
It’s not surprising that facial recognition has trouble with makeup... if you’ve seen some of these photos of celebrity women with no makeup you wouldn’t recognize them in a million years.
Very surprising to me! I thought facial recognition was supposed to work on the relative size and separation of major anatomical features — eyes, nose, mouth, jawline, ears, that sort of thing.
Any such recogniser that can be fooled by the usual surface details would not be fit for purpose, especially given that roughly half the adult population of rich Western nations is under significant social pressure to apply such decoration.
That said, the examples given in the article seem rather more extreme than any conventional makeup. I can easily believe this would fool an AI that scored perfectly against conventional (non-subculture) Western levels of makeup.
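For what it's worth, here's a minimal toy sketch of the "relative spacing of anatomical features" idea described above. The landmark names, the choice of ratios, and the threshold are all illustrative assumptions on my part, not how any particular commercial system works:

```python
# Toy sketch: represent a face as scale-invariant ratios between landmark
# distances, then compare two faces by the distance between their ratio vectors.
import math

def dist(a, b):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def ratio_signature(landmarks):
    """Build a scale-invariant signature from named (x, y) landmark positions."""
    eye_gap = dist(landmarks["left_eye"], landmarks["right_eye"])
    nose_mouth = dist(landmarks["nose_tip"], landmarks["mouth_center"])
    eye_nose = dist(landmarks["left_eye"], landmarks["nose_tip"])
    jaw_width = dist(landmarks["left_jaw"], landmarks["right_jaw"])
    # Divide by the eye gap so the signature doesn't depend on image scale.
    return [nose_mouth / eye_gap, eye_nose / eye_gap, jaw_width / eye_gap]

def same_person(landmarks_a, landmarks_b, threshold=0.05):
    """Crude comparison: small signature distance means a likely match."""
    sig_a, sig_b = ratio_signature(landmarks_a), ratio_signature(landmarks_b)
    error = math.sqrt(sum((x - y) ** 2 for x, y in zip(sig_a, sig_b)))
    return error < threshold
```

From what the article and the linked thread suggest, the paint seems to break things upstream of a step like this, by confusing where the detector thinks the jawline and eye regions are, rather than by changing the ratios themselves.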
It's interesting that makeup does a great job of fooling people (that's a big part of why it exists) yet such a poor job of fooling machines. It shows/reminds us that machines do not see the world the way people do.
The article says "in some cases", for some reason, but the link the article references for this statement is less ambiguous and actually states unequivocally that it's "less effective than juggalo makeup" [0].
> The title of this article seems to imply that face painting was intentionally used among Juggalos to deter computerized facial recognition...
I find it so interesting how we all interpret text so differently.
As I said in another comment, knowing how many discoveries throughout history were completely accidental, I actually interpreted the headline to mean they "stumbled upon" a method to beat facial recognition.
It's neat how we all infer our own meanings from headlines; at least here in this community we generally assume good faith rather than jumping to the assumption that the authors are bad people.
And you’re right about the CVDazzle, the cyberpunk nerd in me desperately hopes something like this becomes a fashion trend.
Not only that but the article implies that they "beat" facial recognition and yet the makeup in question wouldn't fool Apple's FaceID, which is probably the largest facial recognition platform in the wild, because it's not just using camera data. This might fool Facebook's matching algorithms from pictures but it's not going to ruin anyone's day from a security standpoint.
> [...] and yet the makeup in question wouldn't fool Apple's FaceID, which is probably the largest facial recognition platform in the wild, because it's not just using camera data.
They should seek funding to help them continue to disrupt the facial recognition space. The entrepreneurial spirit really is alive everywhere you look.
Checked the cvdazzle site. Why do these models - and models in "high fashion" photos in general - always look like they feel deep contempt towards the viewer? Where did this trope come from? What are they trying to suggest? I don't get it.
Sometimes it's for a better picture (looking up from under lowered eyebrows, for instance), and sometimes it's just how their faces look - consider the phenomenon of "resting b*&^! face".
Many people do consider a neutral face to be aggressive or unpleasant, studies show, regardless of the intent of the person with the face.
Especially to Western eyes, resting faces--some in particular--can often be interpreted that way because we're so used to forced smiles. This is particularly true for strangers because we don't have the experience to read their individual emotional expressions.
> So Jason Rogers and Abbe Macbeth, behavioral researchers with international research and innovation firm Noldus Information Technology, decided to investigate: Why are some faces seen as truly expressionless, but others are inexplicably off-putting?
> ...
> The researchers enlisted Noldus’s FaceReader.... The software...analyzes the image and assigns an expression based on eight basic human emotions: happiness, sadness, anger, fear, surprise, disgust, contempt, and “neutral.”
> To establish a baseline, Rogers and Macbeth first had FaceReader assess a series of genuinely expressionless faces. Those expressions registered about 97 percent neutrality, Macbeth said...
> Then they plugged in photos of RBF all-stars Kanye West, Kristen Stewart and Queen Elizabeth. Suddenly, the level of emotion detected by the software doubled to six percent.
> One particular emotion was responsible for the jump: “The big change in percentage came from ‘contempt,’” Macbeth said.
Re: forced smiles. I encourage other adults not to say 'smile' when taking a photo of their children. Their natural expressions are often funnier and more insightful especially years later.
I know this is subjective but I just don’t see “disdain and contempt” (which assumes the model is trying to communicate something about the viewer) in the sample you provided. I see possibly “secretiveness” and “a challenging disposition” (I assume the model is trying to convey something about herself).
Narrowing the eyes looks like disdain to you, but that's a side effect: widening the eyes draws attention to them, and clothing models aren't supposed to draw attention to their eyes.
If I had to pose for a camera while wearing ridiculous-looking clothes and/or makeup, I'd have some pretty deep contempt for everyone involved (including the viewer) as well.
It actually does, and with surprising frequency. Professional models in the high-fashion world often endure considerable amounts of work-related stress, and the decision to do a particular job is often less of an "I like doing this" and more of an "I have to do this to stay relevant / make ends meet / satisfy contract obligations / etc.". Even a seemingly-normal photoshoot can end up being excessively taxing both physically and mentally; the viewer only sees the end result, not the hours of standing around holding poses while the photographers try to get the photo oh so perfect or the hours of wardrobe and makeup prep or the overwhelming pressure to not eat (or - worse - to purge after meals) in order to remain palatable to the mass market and its unrealistic-bordering-on-downright-impossible body image expectations.
My original comment was a lame joke, sure, but it's one with a disturbing amount of truth behind it. Not all models like being models. I'd wager that hating the job might even be the norm. It has a high potential to be excessively stressful and even outright degrading, not to mention that it's a hotbed for shady business practices, outright fraud, and abuse (psychological, physical, and sexual alike). It's a career path notorious for its workers having little to no bargaining power.
I suspect that the folks voting my comment down don't actually know how shitty of a job modeling can be. I don't blame them for not knowing; on the surface, being a model seems fun and glamorous, right? Never mind that - like how office work is (usually) more than sitting around playing Solitaire all day - professional modeling is (usually) more than just hanging out all day and getting the occasional photo taken real quick (there is no "real quick" to these sorts of photo shoots).
I agree with you. The Juggalos painted their faces but someone else discovered its effectiveness in fighting facial recognition.
FTA:
> According to Twitter user @tahkion, a computer science blogger for WonderHowTo, Juggalo makeup outmatches the machine learning algorithms that govern facial recognition technology.
While I think it's impossible to do with this crowd, because we've already been biased by this conversation, it would be interesting to run the phrases "Juggalos figured out how to beat facial recognition" and "Native Americans figured out the secret to a healthy lifestyle" (or some other positive association from a positively associated group) randomly across focus groups.
I suspect people will come out with more problems accepting "figured out" for a negatively associated group than a positively associated one, even though I think the meaning of that wording is identical in both phrases.
This application sketch describes a high quality method for separating detail from overall image region intensity. This intensity-detail decomposition can be used to automate some specialized image alteration tasks. Our work was motivated by the movie 102 Dalmatians, which featured a dalmatian puppy without spots. Animal handlers determined that the obvious idea of applying makeup to the dog was not possible – evidently there was no suitable makeup that was both safe to the dogs and that would stay in place during normal dog activity (including licking). This left Disney TSL with the task of painting out all the spots on the dog every few frames (the paintings were carried for a few frames with a simple scheme).
The spot removal task was larger than anyone guessed, and ultimately required a large number of artists (up to 40) working for eight months. The problem also proved to be more difficult than expected from an algorithmic point of view. As the spots often had visible fur texture, initially it was believed that there must be some simple compositing technique that could lighten these spots.
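To make that intensity-detail decomposition idea concrete, here's a rough sketch of my own: split the image into a smooth "intensity" base and a "detail" residual, edit only the base (e.g., lighten a spot), then recombine so the fur texture survives. Using a plain Gaussian blur as the smoothing step is my assumption for illustration; the paper's actual decomposition is presumably more careful.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def decompose(image, sigma=15.0):
    """Split a grayscale image into a smooth base layer and a detail residual."""
    base = gaussian_filter(image.astype(np.float64), sigma)
    detail = image - base          # high-frequency texture (fur, grain)
    return base, detail

def lighten_region(image, mask, target_level=220.0, sigma=15.0):
    """Raise the local intensity inside `mask` while keeping the texture."""
    base, detail = decompose(image, sigma)
    base = np.where(mask, target_level, base)   # repaint only the base layer
    return np.clip(base + detail, 0, 255).astype(np.uint8)
```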
More specifically: Juggalos figured out how to beat facial recognition tech trained on normal faces and trying to recognize normal faces.
While the black and white paint does seem to negate a lot of the natural contrast human faces have – and on which the recognition of faces is currently based – there seems to be a lot of information left for recognizing specific faces, especially when considering stereo vision or even possible pulsed-laser-reflection / LIDAR-based solutions (think long-range iPhone X sensor).
Even if the recognition of an individual within a database of millions yields too many false positives, the set can be further refined by considering other subject metadata, say geolocation or social connections.
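A toy sketch of that refinement step, just to make the idea concrete. The fields, scores, and thresholds here are illustrative assumptions, not any real system's schema:

```python
import math

def distance_km(a, b):
    """Great-circle distance between two (lat, lon) pairs, in kilometres."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def refine(candidates, sighting, max_km=50, min_face_score=0.6):
    """Prune a face matcher's noisy candidate list using subject metadata."""
    plausible = []
    for person in candidates:
        if person["face_score"] < min_face_score:
            continue  # too weak a facial match to bother with
        if distance_km(person["last_seen"], sighting["camera_location"]) > max_km:
            continue  # last known location makes this match implausible
        plausible.append(person)
    return sorted(plausible, key=lambda p: p["face_score"], reverse=True)
```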
While considering this, it's still important to not succumb to privacy nihilism. Not everything is lost, but it's hard to practice that when 'beating' facial recognition tech means "looking like a fucking juggalo" [1].
I would imagine there is a pretty strong First Amendment case against such laws in the United States. Mask laws have been successfully challenged on those grounds (though some challenges have failed).
I thought we learned who the first Stig was, out of show, and then they replaced him with someone else. I seem to recall reading that here a few years back.
A couple of Stigs actually have been revealed per the Wiki on the character -- https://en.wikipedia.org/wiki/The_Stig -- Perry McCarthy (black Stig) and Ben Collins (first white Stig). Both were revealed out of show of course.
We don't know who the current Stig is, and it's entirely possible, in my opinion, that there have been "stand-in Stigs" as well. Obscuring the face with a tinted-shield racing helmet eliminates one of the main ways people tell others apart. You'd have to rely on clues like height / build instead. That makes it easier to swap out drivers for that character.
OK, but it's not a thing in the US. I pretty much always wear dark glasses and a big floppy hat in the summer, and have never been challenged at my bank. But then, maybe that's because I'm that guy in that hat ;)
I'm American. I can confirm my bank has a sign that says "no hats, no hoods, no sunglasses" on the door. The enforcement seems lax, at least if you have certain skin colors.
Huh. Maybe I've just never seen the sign. Or maybe they don't have such signs in my sleepy little town. And then, there's the fact that I'm an old ~white guy, and everyone knows me. I mean, I have received mail with no street address. Just name, city, state and five-digit postal code :)
But here's the thing: Nobody except my ISP even knows that I use a VPN. Nobody knows that I'm interested in online privacy and security issues. Because I only get into that online, using VPNs and Tor.
I strongly support anti-mask laws... especially in situations when you have two political opponents coming up to one another in the streets. Masks make it way too easy for some kind of provocateur to instigate fights or start a stampede. And this may be egged on by professional dark PR/intelligence people to make one side or another look good/bad.
It's a tough thing to defend on legal grounds because you have to explain how mask laws don't violate the first amendment. There could be some justification that would hold up, but as far as I know the issue hasn't gone before the Supreme Court.
Realistically, I think your right to free expression is threatened by the way you can be targeted for harassment or fired if you exercise it, and the mask is one simple remedy. Basically it makes sense for the same reason the anonymous ballot does. No less plausible than the idea that your freedom of speech is abridged if you can't donate millions of dollars to a political campaign.
Yep, which shows why some of the recent push to criminalize hate speech is problematic: it's used to target fringe groups most people don't like now, but in the future it could be used against anyone else, similar to how mask laws have been used against protesters more recently.
No. It won't. We'll just get used to it. Face identification was solved decades ago. Since then, it has just been a rat race of "what about this occlusion, or this one?" Train a neural net to do paint-to-skin replacement, or use Adobe's context-aware brush to remove overly gel'd hair (the cvdazzle link).
Beards and 2010 Justin Bieber haircuts halted face recognition 10 years ago. How do I know? Because I had to manually label 10,000 frames with 200+ points because the "automatic landmarking tool" didn't put everything in the right spot.
Biometric security is, and always will be, a rat race. Once faces are handled / ruled out, they'll move to the iris (iris extraction at 30 ft / 10 m was possible 10 years ago). After the iris (because everyone will wear Black Mirror contacts or something), they'll move to gait analysis. My thesis was on identifying people based on their facial movements.
The point is, at some point, they're not going to "make it illegal", because it'd be improbable for everyone to adopt this cyberpunk fashion anyway. Plus, there are enough other ways to identify someone besides the face that most future systems will likely be a combination of all of them, using some level of confidence.
Denmark did this recently with any type of face-covering, and judging by how Reddit gushed over the decision it should be no trouble to enact similar legislation elsewhere:
"Accurate" isn't a terribly meaningful description. We really need to talk about sensitivity and specificity. If you're looking for a very low-probability event (a terror suspect in a random crowd) then your facial recognition algorithm needs to have phenomenally high specificity to be useful, otherwise you'll be glutted with false positives.
This isn't a problem isolated to facial recognition - many cancer screening tests are worse than useless, because the harm of unnecessary treatment due to false positives outweighs the benefits of early detection.
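To put numbers on that base-rate problem (the figures here are made up purely for illustration, not taken from the article):

```python
# Suppose 1 in 100,000 people in a crowd is on a watchlist, and the recogniser
# has 99% sensitivity and 99.5% specificity. How often is a flagged person
# actually on the watchlist?
def positive_predictive_value(prevalence, sensitivity, specificity):
    true_pos = prevalence * sensitivity               # watchlist hits correctly flagged
    false_pos = (1 - prevalence) * (1 - specificity)  # innocent passers-by flagged
    return true_pos / (true_pos + false_pos)

ppv = positive_predictive_value(prevalence=1e-5, sensitivity=0.99, specificity=0.995)
print(f"{ppv:.2%}")  # ~0.20%: roughly 500 false alarms for every genuine match
```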
> People are constantly trying to come up with ways to work around facial recognition technology using everything from rigged hats (if you’re out in public) to heavy pixelation (if you’re online).
Why does "online" necessarily involve facial images? I've used this image[0] for several years. I did use a cropped and fuzzed version of this[1], for a while. And other images, for other personas.
Face recognition is a great example of a privacy-toxic technology thwarted by something like GDPR.
As soon as you cross the line from mass recording of anonymous images to recognizing individuals, even in a public space, you require consent.
The only problem is that the main users of face recognition, security agencies and government, are not subject to GDPR; that exemption needs to be restricted in a future version, because doing it today wouldn't have been politically feasible.
> Face recognition is a great example of a privacy-toxic technology thwarted by something like GDPR [...] security agencies and government are not subject to GDPR.
Sounds to me like it is the opposite and implicitly condones its use by excluding those organizations. The larger the law-built divide between government rights and non-government rights becomes, the less likely any of us entrepreneurs will be able to counteract it. Next, people will be cheering a law that prevents people from developing facial recognition technology.
Not sure I understand your logic. We moved from a free for all to a significantly restricted regime for personal data. There is, for the first time, a continent-wide recognition of the social value of privacy. This was achieved despite gargantuan efforts to quench it by the affected interests.
And this is a bad result because it doesn't go to the politically impossible extreme? It's a foot in the door; we have a fighting chance for a universal right to privacy. That's the only way to restrict governments, and the only way to get that is to convince the general public that it's a right worth fighting for. The GDPR is a massive step in that direction.
These are similar arguments the proponents of the Patriot act made, lauding the protection it provides and the great achievement in passing. Lots of people don't see this as a big enough issue for the sweeping changes that have been made, and to go from free for all to free for some scares people. We shouldn't confuse the universal right to privacy with the universal requirement for privacy. Not everyone wants the latter and it's not our role to "convince" them as though they are ignorant. Almost everyone agrees with the goals/intent of the legislation, but the implementation leaves a lot to be desired as did its predecessors. They just keep doubling down instead of real pragmatic solutions such as actual enforceable statutes and education. It's so many failures on so many levels, but since it was done with the right idea it seems to get a pass.
I hope there is a better solution than returning to the bauta.
Venice faced a related challenge (in a small city, any passerby has a good chance of recognizing your face), and it seems that their masks were sometimes used year-round for anonymity.
None of these 'hacks' work because almost all involve putting something on your face or your head.
I think the holy grail would be a tiny device that you can wear on your ear, or around your neck, that emits something invisible which obscures only cameras.
That way, humans interacting with other humans in public aren't looking like clowns in makeup and can have normal interactions, while still obscuring and befuddling the big-brother cameras.
I tried to research this in 2015 and was told about infra-red ray emitters. Not sure if there's any truth to it, or if it's even possible. If any of you have made any progress on this and would like to collaborate, please contact me. Email in bio.
I'm surprised no one mentioned that you can actually achieve somewhat the same thing and still be mostly unnoticed for human eyes by attaching some infrared LEDs on a scarf, cap or goggles.
It happens to trick current facial recognition systems looking for typical faces... but I imagine it would be well within the realm of possibility to create a "juggalo" filter that simply adjusts the levels until the impact of the makeup is reduced enough to pull a face out of the image?
So it might thwart some newer learning algos, but I wonder how it would do against the more classical heuristically guided techniques that tend to look at more invariant characteristics like facial feature spacing and ratios.
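Something like this is what I'd imagine by "adjusting the levels" (pure speculation on my part about a possible countermeasure, not a tested defeat of the makeup; the clip thresholds are arbitrary):

```python
import numpy as np

def flatten_paint(gray, low=50, high=205):
    """Clip very dark and very bright pixels so large painted regions become
    flat, while midtone shading from the underlying facial structure (nose,
    eye sockets, jawline) is left untouched for a downstream landmark detector."""
    return np.clip(gray, low, high).astype(np.uint8)
```

You'd feed the flattened image to the landmark detector instead of the raw frame; whether that recovers a matchable face is an open question.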
Maybe you can have advance air support. A drone with an IR laser targeted by a lens detector. But you'd need to avoid targeting people wearing glasses.
But no, that's silly. Really, you just need to give up on the idea of being anonymous in public. Because that was never something that you could count on.
That would likely just flag you as someone trying to obfuscate facial recognition. Making you quite unique, no?
I have noticed that it's very hard to recognize faces of people with very black skin, under low-light conditions. So I can imagine black makeup that's 100% non-reflective across a wide range of wavelengths. But then there's millimeter-wave radar. How would one block reflection of mmW radar?
> Facial recognition is only “beaten” when a person’s normal face can go undetected without looking like a fucking juggalo.
I know this is crudely put and thus being downvoted, but there is a kernel of truth to it, similar to the XKCD encryption / wrench comic [0]. Sure, you might've beaten facial recognition in the lab, but in a real-world scenario you're walking around in ridiculous-looking clown face paint, so you're going to come under immediate scrutiny regardless.
So you're saying the band that produced the lyric "Fucking Magnets, how do they work?" might not have been pointedly trying to solve the problem of privacy in an evolving digital society?
It's actually a little amazing how many people are stumbling over themselves to point out that a group people generally have a poor opinion of couldn't possibly have done anything useful, just because one possible interpretation of the title implies some credit is due to them.
OR, as was my point, the title as written suggests they were actively engaged in efforts to evade facial recognition, which is a completely different story.
That is one interpretation (as I covered elsewhere in these comments, "figured out" is often used to describe discoveries that were accidental in nature; interestingly, people rarely complain about the wording when it's a white guy from 50 years ago), and the fact that people are jumping to that conclusion here, are unwilling to consider that their interpretation might be biased, and feel the need to comment on this when it's already been noted, in numbers that I think are substantially higher than would be seen if the group in question did not have the same reputation, speaks volumes.
Is it actually so implausible that one or more people who consider themselves Juggalos (or associate with the group) noticed they weren't being tagged quite the same way when in makeup, and investigated? If it's not so implausible, why is everyone rushing to point out it's probably not what happened, and if it is implausible, why do you think that? I don't know whether the discoverer is a Juggalo. I suspect he's not, from his Twitter feed, but I can't be sure. I've had plenty of friends into similar tech scenes and also into some weird shit (and I've been to a Defcon in the early 2000s, which is all the extra evidence I need). The fact that so many people seem willing to assume he's not and present that as fact (how many people clicked through to the Twitter feed to see if the person was associated with Juggalos?) in a refutation of the title (which only makes sense with that assumption) suggests something to me.
Sure, but this isn't a story of people who were making a concerted effort to avoid facial recognition, which the title's wording implies for a little bit of sensationalism.
It’s interesting — knowing that so many major discoveries through all of history were completely accidental I didn’t read it this way at all. I actually interpreted it to mean they “stumbled upon” a way to defeat facial recognition.
It never ceases to amaze me how we can all read and infer such different meanings from text.