- So it will report it to the police when it sees too much “threatening content” but won’t report it when someone is literally asking for advice on how to kill themselves. Gotcha. 
- Did people actually think anything they typed to ChatGPT was going to be private and confidential?
  - It’s like googling how to do something illegal. What the fuck else were you expecting to happen?
 
- If you’re going to use an LLM at least run it locally jfc 