comment by jbash ·
2019-02-15T13:05:09.424Z · LW(p) · GW(p)
So, there are several things that might be "property" here.
The method is probably patentable. The trained network is definitely NOT copyrightable by the clear intent of the copyright law, because it's obvious to any honest interpreter that it's nothing like a "creative work". However, based on their track record, if you took it to the Federal Circuit, they'd probably be willing to pervert the meaning of "creative work" to let somebody enforce a copyright on it based on curation of the training data or something equally specious. They may already have done that in some analogous case.
Property rights in patents or copyrights are separate from property rights in actual devices, copies of networks, or whatever. I can own a book without owning the copyright in the book. And if you own the copyright, that does NOT allow you to demand that I give you my copy of the book, even if you don't have a copy yourself.
The nuclear bomb case would involve a "patent secrecy order"... a power which was in fact created exactly for nuclear bombs. I don't think there's such a thing as a "copyright secrecy order".
They could also probably forcibly buy any patent (yes, under eminent domain). Eminent domain is NOT a "requisition", because eminent domain in the US requires compensation as a constitutional matter. I also don't know if they have any processes in place for exercising eminent domain in the case under discussion, and I doubt they do. Some particular agency has to be authorized and funded to exercise a power like that in any given case.
Even if the government forcibly bought a patent or copyright, that by itself would not entitle the government to be given a copy of the subject matter. I don't know if bits, as opposed to the media they were on, would even be "property".
... but if you REALLY want to go there, well, obviously the US Government, taken as a whole, could pass a law giving itself the power to force OpenAI to hand over copies, delete its own copies, relinquish any patent or copyright rights (possibly with a requirement for money compensation for those last two), stay out of Ireland, and whatever else.
What I'm really puzzled by is the extreme counterfactuality of the question. It just doesn't seem to have any connection at all with how people or institutions actually behave. A neural network that can sound like somebody isn't a nuclear bomb, and the political dynamics around it are completely different.
The upper echelons of the US Government won't notice it at all.
If some researcher working for the US Government (or any government) wants a copy of the network for some reason, that person will just send a polite email request to OpenAI, and OpenAI will probably hand it over without worrying about it. If OpenAI doesn't, the question will probably die there. From a practical point of view, that researcher won't be able to make it enough of a priority for the government to even stir itself to figure out which powers might apply.
If some agency of the government suggests to OpenAI that it never release the network to anybody, and gives any kind of meaningful reason, then OpenAI will probably take that into account and comply. That's extremely unlikely, though.
Some government agency trying to actually force OpenAI not to release is farfetched enough not to be worth worrying about, but it would probably come down to timing; OpenAI might be able to release before the government could create any binding order preventing it.
comment by ioannes_shade ·
2019-02-15T15:52:19.281Z · LW(p) · GW(p)
Thanks, this is helpful.
> What I'm really puzzled by is the extreme counterfactuality of the question.
It doesn't feel too extreme to me (powerful new dual-use technology), but probably our priors are just different here :-)