Or here’s another idea. If this is actually a real threat, how about we treat it like one? We can simply choose not to develop certain technologies; we’ve done it multiple times before. We’ve had the tech for human cloning for decades, but we decided it was unethical and simply chose not to pursue it. We could do the same for AI beyond a certain level of complexity.
Hell, if this really is a threat to the human race, I would fully support outlawing computers entirely if that’s what it took. Fuck it, we’ll just go back to pen and paper. It would be an extreme step, but if that’s what it takes, so be it. We can go full Dune: “Thou shalt not make a machine in the likeness of a human mind.”
At the very least, we ought to ban wasting more than some set number of GWh per year on training the same model as last year but with 100× the parameters.