Ok, so to be clear, you’re saying that AI x-risk is already partially or even mostly bundled under “We end stratified society and power disparity, or we die.”?
That is a good assessment. Yes.
In fact, the race between capitalist interests to bypass safety and reach operational AGI first is entirely about seizing that power so it can be used to hold everyone else hostage.
Yeah, fair enough. I do agree that this is largely driven by capitalism, and if we didn’t have a capitalist society we would hopefully be going about this more cautiously. Still, I feel like it’s a unique enough situation that I would consider it its own x-risk.