@lain does anthropic actually believe in the risk of autonomous misalignment?
i feel like thats the only reason they would do this. they already work with the USG on all kinds of evil shit and its never been a problem until the USG demanded the safety shit needs to go