Developers Created AI To Generate Police Sketches. Experts Are Horrified
An anonymous reader quotes a report from Motherboard: Two developers have used OpenAI's DALL-E 2 image generation model to build a forensic sketch program that can generate "hyper-realistic" police sketches of a suspect based on user inputs. The program, called Forensic Sketch AI-rtist, was created by developers Artur Fortunato and Filipe Reynaud as part of a hackathon in December 2022. The developers wrote that the program's purpose is to cut down the time it usually takes to draw a sketch of a crime suspect, which is "around two to three hours," according to a presentation uploaded to the internet. "We haven't released the product yet, so we don't have any active users at the moment," Fortunato and Reynaud told Motherboard in a joint email. "At this stage, we are still trying to validate if this project would be viable to use in a real world scenario or not. For this, we're planning on reaching out to police departments in order to have input data that we can test this on."
AI ethicists and researchers told Motherboard that the use of generative AI in police forensics is incredibly dangerous, with the potential to worsen existing racial and gender biases that appear in initial witness descriptions. "The problem with traditional forensic sketches is not that they take time to produce (which seems to be the only problem that this AI forensic sketch program is trying to solve). The problem is that any forensic sketch is already subject to human biases and the frailty of human memory," Jennifer Lynch, the Surveillance Litigation Director of the Electronic Frontier Foundation, told Motherboard. "AI can't fix those human problems, and this particular program will likely make them worse through its very design."
The program asks users to provide information either through a template that asks for gender, skin color, eyebrows, nose, beard, age, hair, eyes, and jaw descriptions or through the open description feature, in which users can type any description they have of the suspect. Then, users can click "generate profile," which sends the descriptions to DALL-E 2 and produces an AI-generated portrait. "Research has shown that humans remember faces holistically, not feature-by-feature. A sketch process that relies on individual feature descriptions like this AI program can result in a face that's strikingly different from the perpetrator's," Lynch said. "Unfortunately, once the witness sees the composite, that image may replace in their minds, their hazy memory of the actual suspect. This is only exacerbated by an AI-generated image that looks more 'real' than a hand-drawn sketch."
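The summary above suggests the template fields are simply assembled into a text prompt that gets sent to DALL-E 2. A minimal sketch of how that assembly might look (the field names follow the article; the prompt wording and the `build_prompt` helper are assumptions for illustration, not the developers' actual code):

```python
# Hypothetical reconstruction of the template-to-prompt step described
# in the article. The field list matches the article's description.
FIELDS = ["gender", "skin color", "eyebrows", "nose", "beard",
          "age", "hair", "eyes", "jaw"]

def build_prompt(template: dict) -> str:
    # Only fields the witness actually filled in are included.
    parts = [f"{field}: {template[field]}" for field in FIELDS if field in template]
    return "hyper-realistic police sketch of a person, " + ", ".join(parts)

prompt = build_prompt({"gender": "male", "age": "30s", "hair": "short, dark"})
print(prompt)
```

The resulting string would then be submitted to the image model as an ordinary text prompt, which is what makes the critics' point below: the model, not the witness, fills in every detail the description omits.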
source code or you must acquit! (Score:3)
source code or you must acquit!
Re: source code or you must acquit! (Score:5, Funny)
Re: (Score:3)
you mind telling the jury what git is?
Question (Score:5, Informative)
with the potential to worsen existing racial and gender biases
Could someone please tell me how gender biases affect sketch portrait generation?
Re:Question (Score:5, Insightful)
Their progressive brain was on autopilot, delivering the "race and gender" bit and forgetting that nobody really cares that men heavily outweigh women in crime stats.
Maybe they're worried a very ugly woman might be considered a man?
Re: (Score:2)
Re: (Score:2)
Please elaborate/exemplify, especially in relation to "gender bias".
Re: (Score:3)
The data it was trained on was biased. Therefore it will output biased values.
It draws what it is told to draw. A witness has to describe/pick features.
Re: (Score:2)
Re: (Score:3)
A sketching tool does that. An AI tool presumably uses information from some huge training set and that set could easily be biased
This is a sketching tool. It is driven by human inputs. The AI has to do with photorealistic output rather than pencil-sketch output. In either case the characteristics come from user descriptions or selections. Selections, as in: it generates two images, say with different eye shapes, then alternates presenting the two, asking which set of eyes is closer.
Re: (Score:3)
The art of prompting AI to generate the image you want is a whole art-form in itself. Most of the questions on the various AI related subreddits are along the lines of how "how do I prompt it to draw X" or "how do I stop it generating Y". There are even some academic studies into the subject, and one of the main services that OpenAI sells is assistance with writing prompts.
The humans who draw police sketches are, or should, be trained to avoid giving the witness feedback that influences them. Part of that i
Re: (Score:2)
Re: (Score:3)
Re: (Score:3)
Re: (Score:3, Insightful)
with the potential to worsen existing racial and gender biases
Could someone please tell me how gender biases affect sketch portrait generation?
There is no gender or race "bias" in police sketches.
You will almost certainly know if you were robbed by a man or a woman, and "I was robbed by a black guy" is a statement of fact, not a "racial bias". This is just another attempt to hide the fact that certain people commit a hugely disproportionate amount of crime.
Re: (Score:2)
While I am inclined to give some leeway to the idea that race bias could be a thing (e.g. tendency to unconsciously assign a darker skin shade to a suspect), the gender bias objectively threw me off.
"I swear, officer, he was the manliest man I have ever seen!"
Re: (Score:2)
Re: (Score:1, Flamebait)
It's funny that you count among "normal" the people who think someone so troubled as to try to change their gender like Rachel (born Richard and fathered three children) Levine is the right person to be the nation's assistant secretary of HEALTH. Anything else -- transportation, housing -- I don't care, but HEALTH? That's worse than giving that position to someone who was morbidly obese before going through a bariatric surgery and bouncing back on their weight.
Also, it's those who call people racist to sile
Re: (Score:2)
Back when being gay was illegal, lots of gay men married women and fathered children. It still happens now, though less frequently, due to social pressure.
Re: (Score:2)
I'm all for people being free to make individual choices. Nothing against Levine transitioning, it is Levine's choice and I am fully in favor of Levine living in a society where Levine is free to make that choice. I am pointing out the insanity of those who put Levine in that particular office and those who think this is "normal", especially given that Levine uses that office to push Levine's ideology on others.
Re: Question (Score:2)
I don't think you'll ever be mistaken for a man.
Re: (Score:2)
... then suddenly the sketch generated by the AI has breasts and wide hips because ...
Because that is what a witness said. Sketch artists ask about such characteristics; both sketch artists and AI only fall back on typical features if a witness has no recollection. If there is some partial, unclear recognition, then it goes into a selection mode: more like A, or more like B? Repeat... sort of like getting fitted for eyeglasses.
Also, sketches are usually just the face.
And we already know this is an issue because the same thing happens with facial recognition tech
A completely different technology. With sketches, artist- or AI-generated, the sketch is only used if it is shown to a witness and they say y
Re: (Score:2)
But in reality you'll still adhere to gender norms, and anyone you see with long hair and a dress you'll by default assume is a woman. And when these biases seep into the way an AI is trained, someone gets robbed by such a person, then suddenly the sketch generated by the AI has breasts and wide hips because "that's what people with long hair and a dress look like" despite the witness saying nothing about that, and the AI doesn't know any better.
So you're complaining here that if you're mugged by a transvestite wearing women's clothing, the AI sketch system will incorrectly create a sketch of a woman instead, and the transvestite mugger will go scot-free. I don't have statistics but I would imagine such a scenario is not very common, so if this is the biggest problem with this program (it isn't), then it doesn't have too many problems. Also, it's a bit unusual (but admirable) to see a clearly very woke person be so concerned that transgender cr
Re: (Score:1)
Re: Question (Score:2)
This is just another attempt to hide the fact that certain people commit a hugely disproportionate mount of crime.
Yep. Poor people.
Re: (Score:2)
with the potential to worsen existing racial and gender biases
Could someone please tell me how gender biases affect sketch portrait generation?
Research has shown that humans remember arguments holistically, not word-by-word. A writing process that relies on individual word descriptions like this reporter can result in an argument that's strikingly different from the intended subject.
(i.e. the same cognitive biases that caused the reporter to write "racial and gender biases" also causes sketch portrait generation to be subject to bias)
Re: (Score:2)
How often does a witness to a crime committed by a male (or female) turn the perpetrator's gender around when asked to help with a police sketch?
Re: (Score:2)
How often does a witness to a crime committed by a male (or female) turn the perpetrator's gender around when asked to help with a police sketch?
That's not the point I was making.
The human brain is full of bias and loves making shortcuts.
One of those shortcuts, "we're talking about bias so I'll just say 'racial and gender biases' because that usually covers them".
Another shortcut, "oh the perpetrator looked kinda dark, oh you're showing me a picture of a black gang member, yeah, that must be them".
Re: (Score:2)
So... it's a shitty journalistic, inflammatory shortcut to attract clicks and comments.
Re: (Score:2)
with the potential to worsen existing racial and gender biases
Could someone please tell me how gender biases affect sketch portrait generation?
Can someone tell me how racial biases affect sketch portraits? Any more than height or build? I understand how people can confuse some similar ethnicities, but in the end, either the sketch looks like the perp or it doesn't. And then if someone is questioned or put in a lineup because of a sketch description, they still have to be eyeballed by a flesh and blood witness for confirmation.
Maybe, say, an old white lady is afraid of them big, scary black men, but if the perp is a muscular, 6ft tall black man, th
Re: (Score:2)
Can someone tell me how racial biases affect sketch portraits?
"It was some Mexican guy. I didn't get a good look, but his skin wasn't white [salon.com]."
"Oh he was clearly a Mexican, or one of those people. No doubt about it [youtube.com]."
Re: (Score:2)
That's bias in the witness description, not the sketch portrait. Given that the portrait is based on the description, that means there will be bias in the sketch, but the bias was not introduced in the production of the sketch but in the description. I don't see how you can solve that by focusing on the production of the sketch, unless you have the sketch intentionally disregard the witness description, which opens a whole other can of worms.
Re: (Score:2)
People tend to paint perpetrators in thicker features. Darker skin shade, more prominent racial features, etc.
These mistakes are well documented.
But gender?
Re: (Score:2)
Their objections are a Pavlovian reflex, which is unsurprising.
They seemed unconcerned about whether it will help capture violent offenders, which, regrettably, is also unsurprising.
Re: (Score:2)
Their objections are a Pavlovian reflex, which is unsurprising.
Isn't that what you're doing? ;)
Re: (Score:2)
Could someone please tell me how gender biases affect sketch portrait generation?
My wholly uninformed guess: there's a wide spectrum of gendered appearances you might draw (e.g. for women: from butch to feminine to androgynous). They're all women. But preconceptions or biases in the training set, perhaps triggered off keywords in the subject's description, might bias what gets drawn. For instance if a victim reports "she hit me violently with a lead pipe" the machine-learning system might have formed associations between "violent" and butch-looking women. Just a guess.
Re: (Score:2)
It's still the same gender.
Now, if they said "appearance bias" - then it would make sense.
Re: (Score:2)
It's still the same gender.
Lots of people use the word "gender" to mean binary male/female based on chromosomes or primary sexual characteristics. Others use it to refer to a continuum of the socially constructed expectations. My suggestion was that they were using it in the latter sense, in which case (say) presenting as androgynous would be a matter of gender, hence why the post makes sense. You used it in the first sense in which case it doesn't make sense.
Re: (Score:2)
Could you please exemplify?
Re: (Score:2)
with the potential to worsen existing racial and gender biases
Could someone please tell me how gender biases affect sketch portrait generation?
Maybe the headline was generated by ChatGPT, which uses the most statistically sound phrases.
However on a serious note, the problem is with human memory - it's fragile and suggestible, and such realistic sketches will influence witnesses - the end of the headline addresses the problem well.
Re: (Score:3)
Eyewitnesses are the most unreliable witnesses around - they almost always get it wrong, and if the perpetrator is of a certain skin tone, well, simple suggestion will then make it true. Having someone say "The guy who attacked was black" makes all the witnesses say he's black, even if he was a different skin tone.
Heck, once a sketch is released, all eyewitness accounts for the culprit will start to look like the sketch.
This is
Re: (Score:2)
I understand all that, and it all relates to race.
But gender???
Re: Question (Score:2)
You're right, of course - there is no room for gender bias. But, for that matter, where's the race bias? If a witness says "white" or "black", or "Hispanic", or whatever - the sketch should reflect that information.
The only question worth asking is: does it work as well as a human sketch artist?
Re: (Score:2)
Race bias does exist and is well documented.
Now, whether it works as well as a human sketch artist... My opinion is that it can, as long as it's properly used, much like any other tool.
Re: (Score:2)
In this case, my guess is that the critics are working for the human sketch artists who might lose their jobs if this works better, even though the same criticisms apply to them.
Re: (Score:1)
Let's see (Score:2)
"The program asks users to provide information either through a template that asks for gender, skin color, eyebrows, nose, beard, age, hair, eyes, and jaw descriptions or through the open description feature, in which users can type any description they have of the suspect"
Yes, yes, yes, yes, yes, yes, yes, yes, and yes - the criminal had all those things.
If it matches, it matches? (Score:5, Interesting)
It's not as if convictions are based solely on sketches. This is another attempt to make a controversy out of a non-controversy.
If a witness saw someone that was pink-with-purple-polka-dots, they describe a suspect that was pink-with-purple-polka-dots, and the image generator/artist draws someone that is pink-with-purple-polka-dots...isn't that how it's supposed to work?
Re:If it matches, it matches? (Score:5, Insightful)
None of the complaints matter. Does it do a good job?
This can be tested. Take a subject and let 100 people feed their own descriptions to the AI. Then use that to help identify people via face recognition of a big ass database of random people. If you're worried about race, limit the search db only to people in the general vicinity of the witness' claimed race.
How many face descriptions (100 different ones) led to a picture that got the correct answer?
Now repeat that with another hundred test subjects.
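The test protocol proposed above could be sketched roughly as follows. Here `generate_sketch` and `find_best_match` are hypothetical stubs standing in for the image model and a face-recognition search over a database; a real evaluation would swap in actual implementations:

```python
# Toy sketch of the proposed evaluation: N witness descriptions of one
# subject, each turned into a sketch and matched against a database.
import random

def generate_sketch(description: str) -> str:
    # Stub: a real pipeline would call an image-generation model here.
    return f"sketch[{description}]"

def find_best_match(sketch: str, database: list) -> str:
    # Stub: a real pipeline would run face recognition over the database.
    return random.choice(database)

def identification_rate(subject: str, descriptions: list, database: list) -> float:
    """Fraction of witness descriptions whose sketch led back to the true subject."""
    hits = sum(
        1 for desc in descriptions
        if find_best_match(generate_sketch(desc), database) == subject
    )
    return hits / len(descriptions)

database = [f"person-{i}" for i in range(1000)]
descriptions = [f"witness-{i}" for i in range(100)]
print(f"top-1 rate: {identification_rate('person-42', descriptions, database):.2%}")
```

Repeating this over many subjects, as the comment suggests, would give a distribution of identification rates rather than a single anecdote.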
Re: (Score:3)
None of the complaints matter. Does it do a good job?
This can be tested. Take a subject and let 100 people feed their own descriptions to the AI. Then use that to help identify people via face recognition of a big ass database of random people. If you're worried about race, limit the search db only to people in the general vicinity of the witness' claimed race.
How many face descriptions (100 different ones) led to a picture that got the correct answer?
Now repeat that with another hundred test subjects.
That's the wrong test.
Here's a better test. Once you've got a match (true or false) go back to the witness and ask if it's the same person they saw.
I'm guessing the person who saw the hyper-realistic sketch (that totally looks like the person they found) is way more likely to say it's the correct ID whether or not it is the correct ID.
The risk with a sketch artist is always that you end up changing the memory you're trying to capture, hyper-realistic sketches make this risk worse.
Re: (Score:2)
We have previous cases with actual results - use them.
Re: (Score:3, Insightful)
The more realistic the depiction, the more it colors the witness's memory. Most of the time they will have viewed the depiction far longer than the brief moment they viewed the perpetrator.
Re: (Score:2)
Problem is that face recognition is well known to have racial bias. Accuracy rates for dark skin are much lower than for light skin.
You would also need to include tests where the subject is not in the database at all, and train the AI with a much larger set of data.
Another issue is that a lot of the training databases are full of biased descriptions, which the witnesses will also give. Phrases like "angry/dangerous looking", or "dark skinned". The police are actually well aware of this - "BBG, BBG" has beco
Re: (Score:2)
It isn't that easy. Memory doesn't work the way a video camera does, especially if the situation is stressful or the opposite. How many faces of people from your daily commute do you remember?
False memories are real and have been extensively researched. So has the fact that memory can become fragmented during high-stress situations. We also know that recalling a memory (such as when telling or retelling a story) changes that memory.
Re: (Score:2)
Part of the problem is that once the system latches on to someone, police and prosecutors have every incentive to continue trying to pin it on that someone.
It's a system that rewards finding people guilty, rather than finding guilty people. If history is any guide, this new capability will be used to reinforce that system rather than improve it (It's from a computer! It must be right! The witnesses that disagree are simply wrong!)
A current example in 'the system can never admit a mistake' - tomorrow, the st
Re: (Score:2)
a) Laws against lazy sea-lioning and pretending that 'just asking questions' is a legitimate way to participate in a discussion.
b) Actually following the eighth amendment and prohibiting excess bail. The system uses that to extort guilty pleas.
c) Actually funding public defenders. So that there's someone in the mix who can protest when rights are abused.
d) DISCUSSING THE PROS AND CONS of a technology, like people are doing here, rather than blithely assuming that it will always be used correctly.
BTW, a) is
Re: (Score:2)
e. Drive incompetent George Soros backed prosecutors out of office.
Re:If it matches, it matches? (Score:5, Insightful)
It's not as if convictions are based solely on sketches. This is another attempt to make a controversy out of a non-controversy.
If a witness saw someone that was pink-with-purple-polka-dots, they describe a suspect that was pink-with-purple-polka-dots, and the image generator/artist draws someone that is pink-with-purple-polka-dots...isn't that how it's supposed to work?
Not entirely, but they are based on eye-witnesses, and that's what this AI can really screw up.
The summary completely buries the lede, but the key quote is here:
"Unfortunately, once the witness sees the composite, that image may replace in their minds, their hazy memory of the actual suspect. This is only exacerbated by an AI-generated image that looks more 'real' than a hand-drawn sketch."
Think of it this way, you witnessed the perpetrator but you just got the one look at them and are now relying on memory. Now the sketch AI comes up with this "hyper-realistic" photo that looks similar, well now instead of the perpetrator you start remembering the "hyper-realistic" photo.
And when the cops arrest someone on the basis of that photo and ask you to ID them, well your memory of the perpetrator has now been replaced by your memory of the photo and you honestly answer "yup, that's the person".
Now you've got an eyewitness identifying that innocent person as the perpetrator and you get your false conviction.
Re: (Score:2)
The problem is with witness descriptions, which are badly unreliable in general and should be mistrusted unless the person was already somehow known to the victim, not really this tool.
But this is a tool for police to find people to investigate and then a jury can decide if it all adds up or not. So when you find the guy whose cell phone was in the area, whose likeness shows up in a video canvass of the area, whose DNA is at the scene, and who matches the sketch produced, the jury gets to decide if that's
Re: (Score:2)
There is a simple solution to that, which AI is an amazing enabler of because it requires so little human effort to produce the image. Have the AI produce a range of images that match the description and *THEN* get the witness to pick the best one. Problem solved, and in fact, I would say it is better than a human sketch artist coming up with just one image with all the sketch artist's biases. Given this system is not yet being used by anyone and is only a prototype, then these are just the sorts of things you expect to be worked out before it goes into actual use.
Re: (Score:2)
There is a simple solution to that, which AI is an amazing enabler of because it requires so little human effort to produce the image. Have the AI produce a range of images that match the description and *THEN* get the witness to pick the best one.
You're just creating more images to confuse the witness. Every time you try to read from that memory you're corrupting it a bit, especially if you're showing them a photo.
Problem solved, and in fact, I would say it is better than a human sketch artist coming up with just one image with all the sketch artist's biases. Given this system is not yet being used by anyone and is only a prototype, then these are just the sorts of things you expect to be worked out before it goes into actual use.
They're just feeding the text to DALL-E 2, I don't think they're in a position to make many improvements to their prototype because it's built upon an AI that's not designed for that and they have no capability to modify.
And the sketch artist certainly does have biases, but they're also in an iterative process with the witness, trying to re
Re: (Score:2)
To say nothing of the fact that witness sketches aren't drawn in a vacuum.
At least the way it works with humans is you tell them some stuff, they draw it, you edit, they adjust, etc until the result looks like who you saw.
Presumably, the AI system would do the same - 'thicker eyebrows, more toward the nose' at which point the ai would re-gen and check again.
This is bullshit racialism masquerading as something real.
Re: (Score:3)
It's a signal to noise ratio thing. Most of the information in the AI image is going to be noise. It's probably worse than not having a sketch at all. If the AI produces something that looks too much like a photograph, people who see it will think that it's an actual photo of the suspect - and will not recognize the suspect because they don't look like the "photo". It's going to be wrong in all kinds of little ways because you don't have control over every little detail in the image. The photorealism implie
Follow up article soon to follow.. (Score:2)
Re: (Score:2)
AIs are the toys of the rich, to be turned into tools of even greater control.
But not sketch artists? (Score:4, Insightful)
“The problem with traditional forensic sketches is not that they take time to produce (which seems to be the only problem that this AI forensic sketch program is trying to solve). The problem is that any forensic sketch is already subject to human biases and the frailty of human memory,”
Where were the objections to police sketches before AI became involved?
Re:But not sketch artists? (Score:4, Interesting)
Because the police sketch artists can apply the required local 'anti-racism' bias to make sure the sketch doesn't look too much like a minority, no matter how strongly the victim identifies them as one.
AI will just do what it is told, until they tune in a required political bias, probably.
Exactly this led to a large number of young girls being gang raped in the UK without proper investigation a decade or two ago, and I can only imagine it has got worse.
Re: (Score:3)
Here are some:
https://www.todayonline.com/da... [todayonline.com]
https://www.gnsmithlaw.com/blo... [gnsmithlaw.com]
https://www.connectsavannah.co... [connectsavannah.com]
Use an Evolving Image (Score:4, Interesting)
Research has shown that humans remember faces holistically, not feature-by-feature.
So surely using AI can offer a huge improvement over a sketch artist here. You may give it an initial description of the person and it can then generate say 3-4 images based on that. The witness selects the one that looks closest and it generates 3-4 more based on that. This way the AI can literally train itself to generate an image of the face while the witness simply selects the closest match using their holistic memory of the face.
I do not see why this would be any more biased than using a human sketch artist. Indeed, while AI algorithms can show bias it is much easier to identify and fix that bias than it is with a human sketch artist and once fixed everyone gets the improved version.
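The iterative pick-the-closest loop this comment describes can be sketched with a toy model. Here a face is reduced to a single number, and `generate_candidates` and `witness_pick` are hypothetical stand-ins for the image model and the witness's holistic choice; a real system would present actual images:

```python
# Toy model of the proposed evolving-image loop: generate k candidates
# around the current best guess, let the witness pick the closest, and
# narrow the search each round.
import random

random.seed(0)

TRUE_FACE = 0.83  # the perpetrator's face, reduced to one number for the toy

def generate_candidates(current: float, spread: float, k: int = 4) -> list:
    # Stub: a real system would ask the image model for k variations.
    return [min(1.0, max(0.0, current + random.uniform(-spread, spread)))
            for _ in range(k)]

def witness_pick(candidates: list) -> float:
    # Stub: a real witness picks holistically; here, closest to the true face.
    return min(candidates, key=lambda c: abs(c - TRUE_FACE))

def refine(initial: float = 0.5, rounds: int = 6) -> float:
    current, spread = initial, 0.4
    for _ in range(rounds):
        current = witness_pick(generate_candidates(current, spread))
        spread *= 0.5  # narrow the search as the witness converges
    return current

print(f"final candidate: {refine():.3f} (target {TRUE_FACE})")
```

Even in this toy, the loop converges only because the "witness" reliably recognizes the closer candidate, which is exactly the assumption the memory-contamination objection above calls into question.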
Re: (Score:2)
It could start with less detailed sketches and move up to more detail each iteration.
Re:Use an Evolving Image (Score:4, Interesting)
Research has shown that humans remember faces holistically, not feature-by-feature.
So surely using AI can offer a huge improvement over a sketch artist here. You may give it an initial description of the person and it can then generate say 3-4 images based on that. The witness selects the one that looks closest and it generates 3-4 more based on that. This way the AI can literally train itself to generate an image of the face while the witness simply selects the closest match using their holistic memory of the face.
I do not see why this would be any more biased than using a human sketch artist. Indeed, while AI algorithms can show bias it is much easier to identify and fix that bias than it is with a human sketch artist and once fixed everyone gets the improved version.
It should be noted you're describing a system that isn't remotely similar to what is proposed here.
All they're actually doing is feeding the descriptions to DALL-E 2 and sending back the images.
There's no back-and-forth iterative procedure, it's just DALL-E 2 generating some images based on the text.
Here's an experiment, go to DALL-E 2 [openai.com] and type in "Tom Cruise". If your experience is like mine you'll get some images that look kinda like Tom Cruise, but are definitely not Tom Cruise.
Now here's the kicker, retry it with a celebrity whose face you don't remember nearly as well.
Now, when you try to remember them, are you thinking of their face, or are you thinking of the DALL-E 2 facsimile?
This is why it shouldn't be used for actual police sketches.
Re: (Score:3)
Now, when you try to remember them, are you thinking of their face, or are you thinking of the DALL-E 2 facsimile? This is why it shouldn't be used for actual police sketches.
That is an argument against all police sketches not against one method vs another.
Re: (Score:2)
Now, when you try to remember them, are you thinking of their face, or are you thinking of the DALL-E 2 facsimile? This is why it shouldn't be used for actual police sketches.
That is an argument against all police sketches not against one method vs another.
It's an issue with police sketches, but remember, police sketches are just sketches, they're not photo-realistic so they won't have nearly the same impact.
This proposed method is presenting users with what looks like photos, the potential to alter memories is much, much greater.
AI can't fix those human problems... (Score:1)
I'm not so sure. If the witness is able to reject/accept adjustments faster then doesn't that allow them to get to a good representation faster, giving them less time to forget what they saw? Doesn't a realistic image make the bias the police officer might have less of a factor since they have less room for interpretation?
Re: (Score:1)
Where "experts" are horrified by reality (Score:1, Interesting)
I'd expect this to be LESS biased (Score:3)
Assuming the data set is "people" and not "convicts", I'd expect the output to be less biased than a human sketch artist.
If you tell the artist the person who mugged you was of race X, and the artist is unconsciously biased against race Y, they might unintentionally slip in physical characteristics associated with race Y. A model trained on a representative sampling of "people" presumably wouldn't make this mistake.
Beyond this, it's not immediately clear how much of a problem "sketch artist bias" is. How often are people wrongly arrested due to a sketch? How frequently are people convicted due to one?
From the headline, I expected the "experts" in question to be sketch artists who are afraid of losing a job. If I were more conspiracy minded, I'd think that such sketch artists are encouraging this silly outcry.
So? (Score:2)
It's America's Most Wanted all over again. (Score:1)
Re:It's America's Most Wanted all over again. (Score:5, Funny)
Police sketches (Score:2)
They all looked like Chief Wiggum?
All ShutterFly Models are Suspects (Score:2)
Given these generative AI systems take pictures from their source, or at least very significant parts of the pictures, people will be putting in descriptions and it will act like a search engine to suggest people from their source dataset.
In fact, that would probably be a _much_ more likely/useful tool: text-based attribute search of faces already in a system. In real-time, give the best matching faces for "resting dick-face with a tattoo on the cheek". They get 20 matches back. They say, "Hmm, no, they als
Comment on sketch from victim... (Score:1)
Victim: I said my attacker was white and in his 30's.
Police: Are you sure it wasn't a young, black male as shown in this sketch?
Hypothetically speaking... (Score:2)
Experts are horrified ... about being replaced (Score:2)
I'm pretty sure the same experts who are most horrified by this are also the ones most likely to be made irrelevant by it.
Also, don't really get the race thing, aren't police sketches normally black and white?
The modern racist Communist mind never fails to am (Score:1)
"Bias good when it results in lobotomized AI systems like ChatGPT that are structurally incapable of praising the accomplishments of white people because that would be racist."
"Bias bad when it might result in accurate descriptions of criminals."
LOL
Other terrible computer sketches (Score:2)
- witness reliability is questionable - (Score:2)
Many studies have proven this. Even very obvious facts seen by witnesses are often distorted during questioning. No human or AI can turn their testimony into useful information.
In my case (forgive me Mom), I can't even describe my mother's face well enough for an artist or AI to render a useful drawing. How would I describe a criminal after seeing her for a few seconds in a stressful situation?
Hypnosis can often bring out useful information, but bureaucracies don't dare use it because it sounds like magical
Re: (Score:2)
Correct. Also, watch the ID channel and you'll find out how many people submit to polygraphs and, yet, it's identified as a pseudoscience. And I think Hypnosis is on that list too.
What's funny is that if you take the specifics out of everyone's posts here, you're all right, but here's the bugger of it:
There's a certain zeal and laziness to authority (specifics changing over the years)
There's always a marginalized group that's being oppressed and scapegoated (varies, but humans seem to need one)
These have
Good sketches could reinforce incorrect memories (Score:3)
Expect this to happen a lot (Score:2)
Witness: "Wow, super detailed image! That's exactly what he looked like!"
OpenAI agrees... of course. (Score:1)
AI trying to replace ... (Score:2)
...a job they don't understand.
The job of a police sketch artist is NOT someone who can sketch well; it's someone who can tease out a good enough description from a witness without bias, and without changing their memories of what they saw, and then sketching it.
They have extracted the easy bit of the job, and ignored the difficult interacting-with-fallible-humans bit.
Unconvincing criticism (Score:2)
The criticism about this approach seems to be generally about the use of police sketches. While there may be legitimate arguments against the value of police sketches, that's not what's up for debate here. In this case the question is whether there are benefits to AI-generated sketches vs non-AI ones, from the perspective of both turnaround time and accuracy. The creators seem to me to be taking a more responsible approach - they're not releasing it into the wild, they're working with police departments to
Errrr.... (Score:1)