Developers Created AI To Generate Police Sketches. Experts Are Horrified

An anonymous reader quotes a report from Motherboard: Two developers have used OpenAI's DALL-E 2 image generation model to create a forensic sketch program that can create "hyper-realistic" police sketches of a suspect based on user inputs. The program, called Forensic Sketch AI-rtist, was created by developers Artur Fortunato and Filipe Reynaud as part of a hackathon in December 2022. The developers wrote that the program's purpose is to cut down the time it usually takes to sketch a crime suspect, which is "around two to three hours," according to a presentation uploaded to the internet. "We haven't released the product yet, so we don't have any active users at the moment," Fortunato and Reynaud told Motherboard in a joint email. "At this stage, we are still trying to validate if this project would be viable to use in a real world scenario or not. For this, we're planning on reaching out to police departments in order to have input data that we can test this on."

AI ethicists and researchers told Motherboard that the use of generative AI in police forensics is incredibly dangerous, with the potential to worsen existing racial and gender biases that appear in initial witness descriptions. "The problem with traditional forensic sketches is not that they take time to produce (which seems to be the only problem that this AI forensic sketch program is trying to solve). The problem is that any forensic sketch is already subject to human biases and the frailty of human memory," Jennifer Lynch, the Surveillance Litigation Director of the Electronic Frontier Foundation, told Motherboard. "AI can't fix those human problems, and this particular program will likely make them worse through its very design."

The program asks users to provide information either through a template that asks for gender, skin color, eyebrows, nose, beard, age, hair, eyes, and jaw descriptions or through the open description feature, in which users can type any description they have of the suspect. Then, users can click "generate profile," which sends the descriptions to DALL-E 2 and produces an AI-generated portrait. "Research has shown that humans remember faces holistically, not feature-by-feature. A sketch process that relies on individual feature descriptions like this AI program can result in a face that's strikingly different from the perpetrator's," Lynch said. "Unfortunately, once the witness sees the composite, that image may replace, in their minds, their hazy memory of the actual suspect. This is only exacerbated by an AI-generated image that looks more 'real' than a hand-drawn sketch."
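As described, the pipeline is little more than prompt assembly plus a single image-generation call. A minimal sketch of what that might look like, assuming OpenAI's current Python client; the field names, prompt wording, and helper function are illustrative assumptions, not the developers' actual code:

```python
# Hedged sketch of the described flow: assemble a text prompt from the template
# fields and request one DALL-E 2 portrait. Field names and generate_profile()
# are illustrative placeholders, not the developers' code.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def generate_profile(fields: dict) -> str:
    """Compose a description from template fields and request one portrait."""
    description = ", ".join(f"{k}: {v}" for k, v in fields.items() if v)
    prompt = f"Photorealistic portrait of a person. {description}"
    result = client.images.generate(model="dall-e-2", prompt=prompt, n=1, size="512x512")
    return result.data[0].url  # URL of the generated image

print(generate_profile({
    "gender": "male", "skin color": "light", "age": "around 40",
    "hair": "short, grey", "eyes": "brown", "jaw": "square",
}))
```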
  • by Joe_Dragon ( 2206452 ) on Tuesday February 07, 2023 @05:02PM (#63273579)

    source code or you must acquit!

  • Question (Score:5, Informative)

    by war4peace ( 1628283 ) on Tuesday February 07, 2023 @05:05PM (#63273591)

    with the potential to worsen existing racial and gender biases

    Could someone please tell me how gender biases affect sketch portrait generation?

    • Re:Question (Score:5, Insightful)

      by Anonymous Coward on Tuesday February 07, 2023 @05:14PM (#63273613)

      Their progressive brain was on autopilot, delivering the "race and gender" bit and forgetting that nobody really cares that men heavily outnumber women in crime stats.
      Maybe they're worried a very ugly woman might be considered a man?

    • The data it was trained on was biased. Therefore it will output biased values.
      • Please elaborate/exemplify, especially in relation to "gender bias".

      • by drnb ( 2434720 )

        The data it was trained on was biased. Therefore it will output biased values.

        It draws what it is told to draw. A witness has to describe/pick features.

        • A sketching tool does that. An AI tool presumably uses information from some huge training set, and that set could easily be biased.
          • by drnb ( 2434720 )

            A sketching tool does that. An AI tool presumably uses information from some huge training set, and that set could easily be biased.

            This is a sketching tool. It is driven by human inputs. The AI has to do with photorealistic output rather than pencil-sketch output. In either case the characteristics come from user descriptions or selections. Selections as in: it generates two images, say with different eye shapes, then presents both and asks which set of eyes is closer.

        • by AmiMoJo ( 196126 )

          Prompting an AI to generate the image you want is an art form in itself. Most of the questions on the various AI-related subreddits are along the lines of "how do I prompt it to draw X" or "how do I stop it generating Y". There are even some academic studies into the subject, and one of the main services that OpenAI sells is assistance with writing prompts.

          The humans who draw police sketches are, or should be, trained to avoid giving the witness feedback that influences them. Part of that i

          • by drnb ( 2434720 )
            What we are looking at is a rushed hack-a-thon prototype. They even state in the video they plan to make it more interactive, with witness feedback, as when sketching with a human artist. Also, the internal model describing the features is entirely from human input. The AI is really taking this internal model, roughly equivalent to a simple 2D sketch from a physical identikit, and turning that into a photorealistic image. It's this photorealistic image that is really the product of the AI, not the underlying sk
      • by bjwest ( 14070 )
        The first things asked should be the race and sex of the perp, so the AI would have no more bias in drawing the sketch than a human artist would. It's kind of hard to be biased when you're drawing the race and sex of what you're being directed to draw.
    • Re: (Score:3, Insightful)

      by Anonymous Coward

      with the potential to worsen existing racial and gender biases

      Could someone please tell me how gender biases affect sketch portrait generation?

      There is no gender or race "bias" in police sketches.

      You will almost certainly know if you were robbed by a man or a woman, and "I was robbed by a black guy" is a statement of fact, not a "racial bias". This is just another attempt to hide the fact that certain people commit a hugely disproportionate amount of crime.

      • While I am inclined to give some leeway to the idea that race bias could be a thing (e.g. tendency to unconsciously assign a darker skin shade to a suspect), the gender bias objectively threw me off.
        "I swear, officer, he was the manliest man I have ever seen!"

      • This is just another attempt to hide the fact that certain people commit a hugely disproportionate amount of crime.

        Yep. Poor people.

    • with the potential to worsen existing racial and gender biases

      Could someone please tell me how gender biases affect sketch portrait generation?

      Research has shown that humans remember arguments holistically, not word-by-word. A writing process that relies on individual word descriptions like this reporter can result in an argument that's strikingly different from the intended subject.

      (i.e. the same cognitive biases that caused the reporter to write "racial and gender biases" also causes sketch portrait generation to be subject to bias)

      • How often does someone witness a crime committed by a male (or female) and then flip the perpetrator's gender when asked to help with a police sketch?

        • How often does someone witness a crime committed by a male (or female) and then flip the perpetrator's gender when asked to help with a police sketch?

          That's not the point I was making.

          The human brain is full of bias and loves making shortcuts.

          One of those shortcuts, "we're talking about bias so I'll just say 'racial and gender biases' because that usually covers them".

          Another shortcut, "oh the perpetrator looked kinda dark, oh you're showing me a picture of a black gang member, yeah, that must be them".

    • with the potential to worsen existing racial and gender biases

      Could someone please tell me how gender biases affect sketch portrait generation?

      Can someone tell me how racial biases affect sketch portraits? Any more than height or build? I understand how people can confuse some similar ethnicities, but in the end, either the sketch looks like the perp or it doesn't. And then if someone is questioned or put in a lineup because of a sketch description, they still have to be eyeballed by a flesh and blood witness for confirmation.

      Maybe, say, an old white lady is afraid of them big, scary black men, but if the perp is a muscular, 6ft tall black man, th

      • Can someone tell me how racial biases affect sketch portraits?

        "It was some Mexican guy. I didn't get a good look, but his skin wasn't white [salon.com]."

        "Oh he was clearly a Mexican, or one of those people. No doubt about it [youtube.com]."

        • That's bias in the witness description, not the sketch portrait. Given that the portrait is based on the description, that means there will be bias in the sketch, but the bias was not introduced in the production of the sketch but in the description. I don't see how you can solve that by focusing on the production of the sketch, unless you have the sketch intentionally disregard the witness description, which opens a whole other can of worms.

      • People tend to paint perpetrators in thicker features. Darker skin shade, more prominent racial features, etc.
        These mistakes are well documented.
        But gender?

    • with the potential to worsen existing racial and gender biases

      Could someone please tell me how gender biases affect sketch portrait generation?

      Their objections are a Pavlovian reflex, which is unsurprising.

      They seemed unconcerned about whether it will help capture violent offenders, which, regrettably, is also unsurprising.

    • by ljw1004 ( 764174 )

      Could someone please tell me how gender biases affect sketch portrait generation?

      My wholly uninformed guess: there's a wide spectrum of gendered appearances you might draw (e.g. for women: butch, feminine, androgynous). They're all women. But preconceptions or biases in the training set, perhaps triggered by keywords in the subject's description, might bias what gets drawn. For instance, if a victim reports "she hit me violently with a lead pipe," the machine-learning system might have formed associations between "violent" and butch-looking women. Just a guess.

      • It's still the same gender.
        Now, if they said "appearance bias" - then it would make sense.

        • by ljw1004 ( 764174 )

          It's still the same gender.

          Lots of people use the word "gender" to mean binary male/female based on chromosomes or primary sexual characteristics. Others use it to refer to a continuum of the socially constructed expectations. My suggestion was that they were using it in the latter sense, in which case (say) presenting as androgynous would be a matter of gender, hence why the post makes sense. You used it in the first sense in which case it doesn't make sense.

    • with the potential to worsen existing racial and gender biases

      Could someone please tell me how gender biases affect sketch portrait generation?

      Maybe the headline was generated by ChatGPT, which uses the most statistically sound phrases.

      However, on a serious note, the problem is with human memory - it's fragile and suggestible, and such realistic sketches will influence witnesses - the end of the summary addresses the problem well.

    • by tlhIngan ( 30335 )

      Could someone please tell me how gender biases affect sketch portrait generation?

      Eyewitnesses are the most unreliable witnesses around - they almost always get it wrong, and if the perpetrator is of a certain skin tone, well, simple suggestion will then make it true. Having someone say "The guy who attacked was black" makes all the witnesses say he's black, even if he was a different skin tone.

      Heck, once a sketch is released, all eyewitness accounts for the culprit will start to look like the sketch.

      This is

    • You're right, of course - there is no room for gender bias. But, for that matter, where's the race bias? If a witness says "white" or "black", or "Hispanic", or whatever - the sketch should reflect that information.

      The only question worth asking is: does it work as well as a human sketch artist?

      • Race bias does exist and is well documented.
        Now, whether it works as well as a human sketch artist... My opinion is that it can, as long as it's properly used, much like any other tool.

    • They wouldn't. But then this entire line of argumentation is utter garbage. When I see critics falling back on vague statements about racial biases and imagined possibilities of disproportionate impacts, it makes me wonder what their real concern is and what's so wrong about it that they can't come up with a legitimate argument.

      In this case, my guess is that the critics are working for the human sketch-artists who might lose their jobs if this works better; even though the same criticisms apply to them

    • It doesn't. It's just the stupid nannerings of progressive woke insanity to see injustice everywhere. If these dipshits had their way any description of a criminal would be, an "Anthropomorphic being" of some nebulous gender and that's it.
  • "The program asks users to provide information either through a template that asks for gender, skin color, eyebrows, nose, beard, age, hair, eyes, and jaw descriptions or through the open description feature, in which users can type any description they have of the suspect"

    Yes, yes, yes, yes, yes, yes, yes, yes, and yes - the criminal had all those things.

  • by cuda13579 ( 1060440 ) on Tuesday February 07, 2023 @05:13PM (#63273611)

    It's not as if convictions are based solely on sketches. This is another attempt to make a controversy out of a non-controversy.

    If a witness saw someone that was pink-with-purple-polka-dots, they describe a suspect that was pink-with-purple-polka-dots, and the image generator/artist draws someone that is pink-with-purple-polka-dots...isn't that how it's supposed to work?

    • by Impy the Impiuos Imp ( 442658 ) on Tuesday February 07, 2023 @05:20PM (#63273635) Journal

      None of the complaints matter. Does it do a good job?

      This can be tested. Take a subject and let 100 people feed their own descriptions to the AI. Then use that to help identify people via face recognition against a big-ass database of random people. If you're worried about race, limit the search db only to people in the general vicinity of the witness's claimed race.

      How many face descriptions (100 different ones) led to a picture that got the correct answer?

      Now repeat that with another hundred test subjects.
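      For what it's worth, a minimal sketch of that evaluation loop, assuming the open-source face_recognition library; the sketch images, database encodings, and names are placeholders passed in from outside, and none of this reflects the actual project:

```python
# Rough sketch of the proposed test: each of the 100 witness descriptions of the same
# subject has already been turned into an AI sketch image; match each sketch against a
# database of face encodings and count how often the true identity comes out on top.
# The inputs (sketch_paths, db_encodings, db_names, true_identity) are placeholders.
import face_recognition
import numpy as np

def identify(sketch_path, db_encodings, db_names):
    """Return the database identity whose face encoding is closest to the sketched face."""
    image = face_recognition.load_image_file(sketch_path)
    encodings = face_recognition.face_encodings(image)
    if not encodings:
        return None  # no detectable face in the sketch
    distances = face_recognition.face_distance(db_encodings, encodings[0])
    return db_names[int(np.argmin(distances))]

def hit_rate(sketch_paths, true_identity, db_encodings, db_names):
    """Fraction of description-driven sketches that matched the right person."""
    hits = sum(identify(p, db_encodings, db_names) == true_identity for p in sketch_paths)
    return hits / len(sketch_paths)
```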

      • None of the complaints matter. Does it do a good job?

        This can be tested. Take a subject and let 100 people feed their own descriptions to the AI. Then use that to help identify people via face recognition against a big-ass database of random people. If you're worried about race, limit the search db only to people in the general vicinity of the witness's claimed race.

        How many face descriptions (100 different ones) led to a picture that got the correct answer?

        Now repeat that with another hundred test subjects.

        That's the wrong test.

        Here's a better test. Once you've got a match (true or false) go back to the witness and ask if it's the same person they saw.

        I'm guessing the person who saw the hyper-realistic sketch (that totally looks like the person they found) is way more likely to say it's the correct ID whether or not it is the correct ID.

        The risk with a sketch artist is always that you end up changing the memory you're trying to capture; hyper-realistic sketches make this risk worse.

        • The proper test is to run it through previous cases that were actually solved and use the descriptions from those cases to see if it generates anything that looks like the convicted perpetrator.

          We have previous cases with actual results - use them.
        • Re: (Score:3, Insightful)

          by Rockoon ( 1252108 )
          You need to be careful to never let the witness see the AI generated depiction

          ...and that's the problem: you have to let them see it.

          The more realistic the depiction, the more it colors the witness's memory. Most of the time they will have viewed the depiction far longer than the brief moment they viewed the perpetrator.
      • by AmiMoJo ( 196126 )

        Problem is that face recognition is well known to have racial bias. Accuracy rates for dark skin are much lower than for light skin.

        You would also need to include tests where the subject is not in the database at all, and train the AI with a much larger set of data.

        Another issue is that a lot of the training databases are full of biased descriptions, which the witnesses will also give. Phrases like "angry/dangerous looking", or "dark skinned". The police are actually well aware of this - "BBG, BBG" has beco

      • by Tom ( 822 )

        It isn't that easy. Memory doesn't work the way a video camera does, especially if the situation is stressful or the opposite. How many faces of people from your daily commute do you remember?

        False memories are real and have been extensively researched. So has the fact that memory can become fragmented during high-stress situations. We also know that recalling a memory (such as when telling or retelling a story) changes that memory.

    • Part of the problem is that once the system latches on to someone, police and prosecutors have every incentive to continue trying to pin it on that someone.

      It's a system that rewards finding people guilty, rather than finding guilty people. If history is any guide, this new capability will be used to reinforce that system rather than improve it (It's from a computer! It must be right! The witnesses that disagree are simply wrong!)

      A current example in 'the system can never admit a mistake' - tomorrow, the st

    • by quantaman ( 517394 ) on Tuesday February 07, 2023 @06:20PM (#63273807)

      It's not as if convictions are based solely on sketches. This is another attempt to make a controversy out of a non-controversy.

      If a witness saw someone that was pink-with-purple-polka-dots, they describe a suspect that was pink-with-purple-polka-dots, and the image generator/artist draws someone that is pink-with-purple-polka-dots...isn't that how it's supposed to work?

      Not entirely, but they are based on eye-witnesses, and that's what this AI can really screw up.

      The summary completely buries the lead but the key quote is here:
      "Unfortunately, once the witness sees the composite, that image may replace in their minds, their hazy memory of the actual suspect. This is only exacerbated by an AI-generated image that looks more 'real' than a hand-drawn sketch."

      Think of it this way, you witnessed the perpetrator but you just got the one look at them and are now relying on memory. Now the sketch AI comes up with this "hyper-realistic" photo that looks similar, well now instead of the perpetrator you start remembering the "hyper-realistic" photo.

      And when the cops arrest someone on the basis of that photo and ask you to ID them, well your memory of the perpetrator has now been replaced by your memory of the photo and you honestly answer "yup, that's the person".

      Now you've got an eyewitness identifying that innocent person as the perpetrator and you get your false conviction.

      • The problem is with witness descriptions, which are badly unreliable in general and should be mistrusted unless the person was already somehow known to the victim, not really this tool.

        But this is a tool for police to find people to investigate and then a jury can decide if it all adds up or not. So when you find the guy whose cell phone was in the area, whose likeness shows up in a video canvass of the area, whose DNA is at the scene, and who matches the sketch produced, the jury gets to decide if that's

      • by jabuzz ( 182671 )

        There is a simple solution to that, which AI is an amazing enabler of because it requires so little human effort to produce the image. Have the AI produce a range of images that match the description and *THEN* get the witness to pick the best one. Problem solved, and in fact, I would say it is better than a human sketch artist coming up with just one image with all the sketch artist's biases. Given that this system is not yet being used by anyone and is only a prototype, these are just the sorts of things you expect to be worked out before it goes into actual use.

        • There is a simple solution to that, which AI is an amazing enabler of because it requires so little human effort to produce the image. Have the AI produce a range of images that match the description and *THEN* get the witness to pick the best one.

          You're just creating more images to confuse the witness. Every time you try to read from that memory you're corrupting it a bit, especially if you're showing them a photo.

          Problem solved, and in fact, I would say it is better than a human sketch artist coming up with just one image with all the sketch artist's biases. Given that this system is not yet being used by anyone and is only a prototype, these are just the sorts of things you expect to be worked out before it goes into actual use.

          They're just feeding the text to DALL-E 2; I don't think they're in a position to make many improvements to their prototype because it's built upon an AI that's not designed for this and that they have no capability to modify.

          And the sketch artist certainly does have biases, but they're also in an iterative process with the witness, trying to re

    • To say nothing of the fact that witness sketches aren't drawn in a vacuum.
      At least the way it works with humans is you tell them some stuff, they draw it, you edit, they adjust, etc. until the result looks like who you saw.

      Presumably, the AI system would do the same - 'thicker eyebrows, more toward the nose' - at which point the AI would re-generate and check again.

      This is bullshit racialism masquerading as something real.

    • It's a signal to noise ratio thing. Most of the information in the AI image is going to be noise. It's probably worse than not having a sketch at all. If the AI produces something that looks too much like a photograph, people who see it will think that it's an actual photo of the suspect - and will not recognize the suspect because they don't look like the "photo". It's going to be wrong in all kinds of little ways because you don't have control over every little detail in the image. The photorealism implie

  • That the developers just scraped images off the internet and the AI uses them as "reference" to build the "hyper-realistic sketch".
    • Isn't that what AI means? We made a datacenter, filled it with data scraped from the internet, and created a really complex if loop.
      AIs are the toys of the rich, to be turned into tools for even more control.
  • by Gravis Zero ( 934156 ) on Tuesday February 07, 2023 @05:24PM (#63273647)

    “The problem with traditional forensic sketches is not that they take time to produce (which seems to be the only problem that this AI forensic sketch program is trying to solve). The problem is that any forensic sketch is already subject to human biases and the frailty of human memory,”

    Where were the objections to police sketches before AI became involved?

  • by Roger W Moore ( 538166 ) on Tuesday February 07, 2023 @05:26PM (#63273655) Journal

    Research has shown that humans remember faces holistically, not feature-by-feature.

    So surely using AI can offer a huge improvement over a sketch artist here. You may give it an initial description of the person and it can then generate say 3-4 images based on that. The witness selects the one that looks closest and it generates 3-4 more based on that. This way the AI can literally train itself to generate an image of the face while the witness simply selects the closest match using their holistic memory of the face.

    I do not see why this would be any more biased than using a human sketch artist. Indeed, while AI algorithms can show bias it is much easier to identify and fix that bias than it is with a human sketch artist and once fixed everyone gets the improved version.
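    A bare-bones sketch of the iterative loop this comment imagines, purely to illustrate the idea (the hackathon prototype does not work this way, as the reply below points out). It assumes the OpenAI Python client; the console I/O and the crude append-feedback-to-the-prompt refinement are placeholders:

```python
# Illustrative only: show a few DALL-E 2 candidates, let the witness pick the closest,
# ask what to change, and fold that feedback back into the prompt for the next round.
# Not the actual Forensic Sketch AI-rtist prototype.
from openai import OpenAI

client = OpenAI()

def iterative_sketch(initial_description: str, rounds: int = 3, candidates: int = 4) -> str:
    prompt = f"Photorealistic portrait: {initial_description}"
    chosen = None
    for _ in range(rounds):
        batch = client.images.generate(model="dall-e-2", prompt=prompt,
                                       n=candidates, size="512x512")
        urls = [img.url for img in batch.data]
        for i, url in enumerate(urls):
            print(f"[{i}] {url}")
        chosen = urls[int(input("Which image looks closest? "))]  # holistic pick
        feedback = input("What should change? ")                  # e.g. "narrower jaw"
        prompt = f"{prompt}. Adjustment: {feedback}"              # crude prompt refinement
    return chosen
```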

    • It could start with less detailed sketches and move up to more detail each iteration.

    • by quantaman ( 517394 ) on Tuesday February 07, 2023 @07:09PM (#63273957)

      Research has shown that humans remember faces holistically, not feature-by-feature.

      So surely using AI can offer a huge improvement over a sketch artist here. You may give it an initial description of the person and it can then generate say 3-4 images based on that. The witness selects the one that looks closest and it generates 3-4 more based on that. This way the AI can literally train itself to generate an image of the face while the witness simply selects the closest match using their holistic memory of the face.

      I do not see why this would be any more biased than using a human sketch artist. Indeed, while AI algorithms can show bias it is much easier to identify and fix that bias than it is with a human sketch artist and once fixed everyone gets the improved version.

      It should be noted you're describing a system that isn't remotely similar to what is proposed here.

      All they're actually doing is feeding the descriptions to DALL-E 2 and sending back the images.

      There's no back-and-forth iterative procedure, it's just DALL-E 2 generating some images based on the text.

      Here's an experiment, go to DALL-E 2 [openai.com] and type in "Tom Cruise". If your experience is like mine you'll get some images that look kinda like Tom Cruise, but are definitely not Tom Cruise.

      Now here's the kicker, retry it with a celebrity whose face you don't remember nearly as well.

      Now, when you try to remember them, are you thinking of their face, or are you thinking of the DALL-E 2 facsimile?

      This is why it shouldn't be used for actual police sketches.

      • Now, when you try to remember them, are you thinking of their face, or are you thinking of the DALL-E 2 facsimile? This is why it shouldn't be used for actual police sketches.

        That is an argument against all police sketches not against one method vs another.

        • Now, when you try to remember them, are you thinking of their face, or are you thinking of the DALL-E 2 facsimile? This is why it shouldn't be used for actual police sketches.

          That is an argument against all police sketches not against one method vs another.

          It's an issue with police sketches, but remember, police sketches are just sketches; they're not photo-realistic, so they won't have nearly the same impact.

          This proposed method is presenting users with what looks like photos, the potential to alter memories is much, much greater.

  • I'm not so sure. If the witness is able to reject/accept adjustments faster then doesn't that allow them to get to a good representation faster, giving them less time to forget what they saw? Doesn't a realistic image make the bias the police officer might have less of a factor since they have less room for interpretation?

  • This is that stupid thing where "experts" are horrified that reality hasn't been perfectly equitable, so some races, historically disadvantaged, have ended up with higher levels of criminality. And so they pretend that the way to equalize the situation is to ignore the criminals in some areas, which ironically keeps those areas disadvantaged and perpetuates the problem.
  • by Cinder6 ( 894572 ) on Tuesday February 07, 2023 @05:41PM (#63273707)

    Assuming the data set is "people" and not "convicts", I'd expect the output to be less biased than a human sketch artist.

    If you tell the artist the person who mugged you was of race X, and the artist is unconsciously biased against race Y, they might unintentionally slip in physical characteristics associated with race Y. A model trained on a representative sampling of "people" presumably wouldn't make this mistake.

    Beyond this, it's not immediately clear how much of a problem "sketch artist bias" is. How often are people wrongly arrested due to a sketch? How frequently are people convicted due to one?

    From the headline, I expected the "experts" in question to be sketch artists who are afraid of losing a job. If I were more conspiracy minded, I'd think that such sketch artists are encouraging this silly outcry.

  • So it just means the AI needs to be trained to ask the same questions as 'the experts', who of course also have their own biases when drawing these sketches. I'm pretty confident that AI will do it even better than a human. But hey, the expert needs to keep his/her job, so of course they think it is dangerous.
  • America's Most Wanted created a problem in that their re-enactors, and people who looked like them, were being reported as the fugitives, not the actual fugitives, even after John Walsh said "THIS is the guy we're after, not the guy doing the re-enactment." I believe this AI may do the same thing due to the extreme realism of the sketches. Current sketches are vague enough to suggest what someone may look like but are obviously not perfect and not intended to be.
  • They all looked like Chief Wiggum?

  • Given these generative AI systems take pictures from their source, or at least very significant parts of the pictures, people will be putting in descriptions and it will act like a search engine to suggest people from their source dataset.

    In fact, that would probably be a _much_ more likely/useful tool: text-based attribute search of faces already in a system. In real-time, give the best matching faces for "resting dick-face with a tattoo on the cheek". They get 20 matches back. They say, "Hmm, no, they als
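    That "text search over faces already in a system" idea is closer to image retrieval than generation. A hedged sketch using CLIP embeddings; the model checkpoint and the list of stored image paths are assumptions, not anything any real police system is known to run:

```python
# Sketch of text-based attribute search over an existing face database using CLIP:
# rank stored images by similarity to a free-text query instead of generating new ones.
# The checkpoint and image_paths are illustrative assumptions.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

def top_matches(query: str, image_paths: list[str], k: int = 20) -> list[str]:
    """Return the k stored images that best match the text query."""
    images = [Image.open(p) for p in image_paths]
    inputs = processor(text=[query], images=images, return_tensors="pt", padding=True)
    with torch.no_grad():
        out = model(**inputs)
    scores = out.logits_per_text.squeeze(0)  # similarity of the one query to each image
    best = torch.topk(scores, k=min(k, len(image_paths))).indices.tolist()
    return [image_paths[i] for i in best]

# e.g. top_matches("face with a tattoo on the cheek", stored_image_paths)
```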

  • Victim: I said my attacker was white and in his 30's.

    Police: Are you sure it wasn't a young, black male as shown in this sketch?

  • If I see a Latino person commit a crime, when it comes to a police sketch, regardless of whether it is by AI or by a sketch artist, I am not going to say this person was white or black. If I have to skew the truth due to my racial bias, no matter how you generate that drawing, it is still subject to the same bias. This is just a tempest in a teapot by race-mongers, nothing more. If you don't want bias, then don't resort to sketching the subjects.
  • I'm pretty sure the same experts who are most horrified by this are also the ones most likely to be made irrelevant by it.

    Also, don't really get the race thing, aren't police sketches normally black and white?

  • "Bias good when it results in lobotomized AI systems like ChatGPT that are structurally incapable of praising the accomplishments of white people because that would be racist."

    "Bias bad when it might result in accurate descriptions of criminals."

    LOL

  • Remember the Identigraph [youtube.com] from For Your Eyes Only?
  • Many studies have proven this. Even very obvious facts seen by witnesses are often distorted during questioning. No human or AI can turn their testimony into useful information.

    In my case (forgive me Mom), I can't even describe my mother's face well enough for an artist or AI to render a useful drawing. How would I describe a criminal after seeing her for a few seconds in a stressful situation?

    Hypnosis can often bring out useful information, but bureaucracies don't dare use it because it sounds like magical

    • Correct. Also, watch the ID channel and you'll find out how many people submit to polygraphs even though the polygraph is identified as a pseudoscience. And I think hypnosis is on that list too.

      What's funny is that if you take the specifics out of everyone's posts here, you're all right, but here's the bugger of it:
      There's a certain zeal and laziness to authority (specifics changing over the years)
      There's always a marginalized group that's being oppressed and scapegoated (varies, but humans seem to need one)
      These have

  • by joe_frisch ( 1366229 ) on Tuesday February 07, 2023 @10:15PM (#63274305)
    People are not great at memorizing faces that were only glimpsed briefly. The risk of a sketching tool that produces very life-like images is that the victim may help create a sketch, and then that hyper-realistic sketch fills in the blanks in the victim's memory. They start to believe the sketch is reality.
  • Police: "Hmmm, I think the AI needs some work. It created an image of a suspect with three eyes and thirteen fingers."
    Witness: "Wow, super detailed image! That's exactly what he looked like!"
  • Describing the perpetrator of a crime solely based on their race and gender can perpetuate harmful stereotypes and contribute to racial and gender biases. This type of language can also lead to discrimination and prejudice, as well as hindering the investigation of a crime by potentially steering attention away from other suspects who do not fit the racial or gender description. Additionally, such descriptions can contribute to further mistrust between law enforcement and marginalized communities. It is im
  • ...a job they don't understand
    A police sketch artist is NOT just someone who can sketch well ... it's someone who can tease out a good enough description from a witness without bias, and without changing their memories of what they saw ... and then sketch it ...
    They have extracted the easy bit of the job, and ignored the difficult interacting with fallible humans bit ...

  • The criticism about this approach seems to be generally about the use of police sketches. While there may be legitimate arguments against the value of police sketches, that's not what's up for debate here. In this case the question is whether there are benefits to AI-generated sketches vs non-AI ones, from the perspective of both turnaround time and accuracy. The creators seem to me to be taking a more responsible approach - they're not releasing it into the wild, they're working with police departments to

  • Ok...so, let me get this straight. Anything the AI does will be based on user input...why do we have to worry about race and gender bias? It's not like they're going to tell the AI to generate a white man and a black woman will pop out, unless it's REALLY fucked up. I'd be more worried about the AI getting small details wrong that could convict the wrong person...I mean, have you seen AI art?
