CALIFORNIA HAS become the first state to specify that 'bots' cannot be made to impersonate humans, and must disclose themselves on their first appearance, much like Bill Cosby.
Senate Bill 1001 (or nine, if you're a robot) was signed into law by Governor Jerry Brown. It is designed to stop devices like Google Duplex from masquerading as human beings.
The bill bans such activity both at a cutting-edge level and at a 'Trump? Really?' level, by ensuring AI cannot be used to "incentivise a purchase or sale of goods or services in a commercial transaction or to influence a vote in an election".
Although the bill is largely aimed at preventing potential vote rigging and the like, the fact is that with robots more able than ever to hold a believable conversation with us, the legislation is going to be wheeled out more and more.
Google has voluntarily said that it will make Duplex disclose its synthetic nature during calls it makes, but under this law, that would become a requirement.
The point is not to 'ban the bot' - that would be almost impossible - but rather to limit the powers of those who seek to abuse the technology. The trouble is, if a bot is behind enemy lines, and indeed outside the jurisdiction of the law, it's not clear exactly how sharp California's teeth can really be.
Governor Brown even has a bot of his own on Twitter, and whilst its views may be a bit partisan, the point is that it is completely transparent about being a bot.
Brown is the same governor who signed the state's net neutrality rules into law, overriding those set out by The Idiot Pai earlier in the year. The state is already being sued by the federal government over that one, because America.
Thing is - if this law is even necessary, surely we're past the level of the Turing Test? And isn't that a bit scary?
We predict that in about 80 years, a robot will be arrested for non-disclosure because it didn't know it was one. μ