To my mind, IP theft is in a very different category than the autonomous copying scenarios. IP theft _under human control_ is a rather minor event. If a grad student at a computer lab at MIT circumvents the license fee for an LLM, gaining access that their department hasn't paid for, but just uses it in their research, that seems no more serious (except financially) than if their department _had_ paid for it. Lumping it in with the autonomous scenarios has a "murder, arson, and jaywalking" feel to it.
Autonomous copying, with no human agency involved, looks far more serious.
An interesting intermediate case is one where massive numbers of AIs are intentionally copied (by humans) into separate systems (e.g. drone weapons), but the sheer number of (originally legitimate) copies makes the risk high that the copying controls might be subverted by another AI system.
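To make the "sheer number" intuition concrete, here is a back-of-the-envelope sketch (my numbers, purely illustrative): assume each copy independently has some small per-copy probability $p$ of having its controls subverted. Then across $N$ copies,

$$P(\text{at least one copy subverted}) = 1 - (1 - p)^N \approx Np \quad \text{for } Np \ll 1.$$

So even $p = 10^{-6}$ per copy gives roughly a $1 - e^{-1} \approx 63\%$ chance of at least one subversion once $N$ reaches a million drones. Independence is doing a lot of work in this sketch; correlated defenses (or correlated attacks) change the numbers, but the scaling with $N$ is the point.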
I partly agree.
While autonomous self-replication is in itself very serious, IP theft is not necessarily disastrous on its own, but it could lead to really bad scenarios. As mentioned in the post, misuse, intensified arms race dynamics, and autonomous self-replication are some potential consequences.
Many Thanks! One comment about the particular case of an arms race: I see this as orthogonal to IP theft, with only the fairly minor effect that IP theft could increase the number of competing organizations in the race. _Mostly_, arms races are between organizations with legally legitimate access to their arms.
You might be interested in the discussion that TT, John Schilling, and I had on arms control of AI, starting at https://www.astralcodexten.com/p/open-thread-368/comment/92858848 tl;dr: Nuclear is the _good_ case for arms control, and it still failed to stop the pariah state North Korea from getting nukes. AI is far, far less favorable.