𝕯𝖎𝖕𝖘𝖍𝖎𝖙

Troll honeypot, apparently.

Suggested blocks:

  • @lemm.ee:
    • @ByteWizard (troll)
    • @Texas_Hangover (troll)
  • @lemmy.world:
    • @neflubaguzzi (troll)
    • @fkn (troll mod)
    • @jopepa (troll)
    • @yggstyle
    • @EdibleFriend (troll)
  • @lemmy.blahaj.zone:
    • @nublug (troll)
    • @StoneGender (troll)
0 Posts
31 Comments
Joined 1 year ago
Cake day: August 2nd, 2023

  • Just going to argue on behalf of the other users who apparently know way more than you and I do about this stuff:

    WhY nOt juSt UsE thE FBi daTaBaSe of CSam?!

    (because one doesn’t exist)

    (because if one existed, it would either be hosting CSAM itself or contain only the hashes of files - hashes which won’t match if even one bit is changed, whether from transmission loss or corruption, automated resizing by image-hosting sites, etc.)

    (because this shit is hard to detect)
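
    A quick sketch of the hash-brittleness point above, using Python's standard `hashlib` (the byte strings are made up for illustration): flipping a single bit in a file yields a completely different cryptographic digest, so an exact-hash lookup misses any re-encoded or corrupted copy.

    ```python
    import hashlib

    # Hypothetical file contents, standing in for an image's bytes.
    original = b"example image bytes"
    corrupted = bytearray(original)
    corrupted[0] ^= 0x01  # flip one bit, as transmission corruption might

    h1 = hashlib.sha256(original).hexdigest()
    h2 = hashlib.sha256(bytes(corrupted)).hexdigest()

    # The two digests are entirely different despite a one-bit change,
    # so a database of exact hashes cannot catch near-duplicates.
    print(h1 == h2)  # False
    ```

    This is why real-world matching systems lean on perceptual or fuzzy hashing rather than exact digests, which brings its own error rates.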

    Some sites have tried automated detection of CSAM images. YouTube, in an effort to protect children, continues to falsely flag 30-year-old women as children.

    OP, I’m not saying you should give up, and maybe what you’re working on could be the beginning of something that truly helps in the field of CSAM detection. I’ve got only one question for you (which hopefully won’t be discouraging to you or others): what’s your false-positive (or false-negative) detection rate? Or, maybe a question you may not want to answer: how are you training this?
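
    For concreteness, here's what those two rates mean, computed from a purely made-up confusion matrix (none of these numbers come from OP's system):

    ```python
    # Hypothetical counts for a detector evaluated on labeled data:
    tp = 90    # true positives: correctly flagged
    fp = 4     # false positives: innocent content flagged
    tn = 900   # true negatives: correctly passed
    fn = 6     # false negatives: missed detections

    # False-positive rate: fraction of innocent content wrongly flagged.
    false_positive_rate = fp / (fp + tn)
    # False-negative rate: fraction of actual positives the detector missed.
    false_negative_rate = fn / (fn + tp)

    print(false_positive_rate, false_negative_rate)
    ```

    Even a tiny false-positive rate matters at scale: at millions of uploads a day, a fraction of a percent is thousands of wrongly flagged users.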