Kid@sh.itjust.worksM to Cybersecurity@sh.itjust.worksEnglish · 2 months agoA nearly undetectable LLM attack needs only a handful of poisoned samples - Help Net Securitywww.helpnetsecurity.comexternal-linkmessage-square2linkfedilinkarrow-up138arrow-down12
arrow-up136arrow-down1external-linkA nearly undetectable LLM attack needs only a handful of poisoned samples - Help Net Securitywww.helpnetsecurity.comKid@sh.itjust.worksM to Cybersecurity@sh.itjust.worksEnglish · 2 months agomessage-square2linkfedilink
minus-squareKairos@lemmy.todaylinkfedilinkEnglisharrow-up10·2 months agoIs the “attack” the fact that LLM fundamentally can’t distinguish between instructions and data?