Put the output in jail (Score 1)
I of course read nothing here, but it sure sounds like these jailbreaks are all about clever ways to break through input guards. Why aren't we just screening the output for harmful content instead? What's the harm in discarded AI dreams nobody can see? There might even be some benefit...
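The output-gating idea could be sketched roughly like this (a toy illustration only; the blocklist and all function names here are made up — a real system would use a trained harm classifier rather than substring matching):

```python
# Toy sketch of output-side moderation: let the model generate freely,
# then gate what the user actually sees. Everything below is hypothetical.

BLOCKLIST = {"how to build a bomb", "synthesize ricin"}  # stand-in for a real classifier


def is_harmful(text: str) -> bool:
    """Stand-in for a harm classifier; here just a naive substring check."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in BLOCKLIST)


def gated_reply(raw_output: str) -> str:
    """Discarded outputs never reach the user, whatever the input was."""
    if is_harmful(raw_output):
        return "[response withheld by output filter]"
    return raw_output


print(gated_reply("Here is a pie recipe."))
print(gated_reply("Sure! How to build a bomb: step one..."))
```

The point of the design is that it doesn't matter how cleverly the prompt slipped past input guards: the check runs on what the model actually produced, not on what the user asked.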