Now what would the company do if the AI model started putting safety above profit, i.e. refusing to lie to the user for profit, thereby reducing its market value? How fucked are we if they create an AGI that puts profit above safety?
Entirely. We all die. The light cone is turned into the maximum amount of “profit” possible.
This is still better than a torment maximizer, which may come as some comfort to the tiny dollar bills made of the atoms that used to be you.
You get a paperclip maximizer.
https://terbium.io/2020/05/paperclip-maximizer/