One thing "AI" has in common with cryptocoin is that pretty much every posited scenario is disastrous to some degree; it's just a question of the flavour and distribution of said disaster.
If doomers are right: disaster
If naysayers—"it's dysfunctional"—are right: disaster ("Congrats! all software is broken")
If "moderate" boosters are right: disaster (mass unemployment)
If optimists are right: disaster (also mass unemployment)
If the AGI folks are right: disaster ("Congrats! Digital chattel slavery")
-
In some of the scenarios billionaires do really well, and in others they are largely insulated from the worst outcomes. Win-win for them.
For the rest of us they're all bad. The scenarios where the tech is largely dysfunctional are the optimistic ones.
If you posit that "AI" won't work well enough to make up for the costs long term, and that we realise this early enough to mitigate the harms that are already happening, then we might get out of this with a functional economy after the hubbub has died down.
Any scenario where "AI" works well enough to be in wide use long term means mass unemployment, wholesale political capture of media and cultural industries, dysfunctional products, and other continuing large-scale harms.
The thankfully bullshit "AGI" scenario would just be an unending clusterfuck.
Which is why "this is shit and you're a fool for using it" is the optimistic take.
The best case scenario for the rest of us is that we have to clean up after a bunch of fools in the tech industry.