It looks like content will have to be labeled to show whether or not it's AI-generated.
And special rules will apply to (see the code sketch after the list):
- any model that was trained using a quantity of computing power greater than 10^26 integer or floating-point operations, or using primarily biological sequence data and using a quantity of computing power greater than 10^23 integer or floating-point operations; and
- any computing cluster that has a set of machines physically co-located in a single datacenter, transitively connected by data center networking of over 100 Gbit/s, and having a theoretical maximum computing capacity of 10^20 integer or floating-point operations per second for training AI.
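To make the thresholds concrete, here's a minimal Python sketch of the two tests as I read them; the function names and structure are mine, not anything from the order's text:

```python
def is_covered_model(total_training_ops: float, primarily_bio_data: bool) -> bool:
    """Model threshold: >1e26 total ops in general, or >1e23 ops when
    trained primarily on biological sequence data."""
    limit = 1e23 if primarily_bio_data else 1e26
    return total_training_ops > limit

def is_covered_cluster(peak_ops_per_sec: float, interconnect_gbit_s: float) -> bool:
    """Cluster threshold: co-located machines on >100 Gbit/s networking
    with a theoretical peak of 1e20 ops/s for training AI."""
    return interconnect_gbit_s > 100 and peak_ops_per_sec >= 1e20

print(is_covered_model(2e25, False))  # False: a run in the low 1e25s stays under
print(is_covered_model(5e23, True))   # True: the bio-data bar is far lower
```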
Also, easier visas for “AI talent”.
This is ridiculous, but it's very good news for China and Europe, who get an opening to catch up!
This 10^26 ops threshold appears to amount to roughly 10,000 A100-years at the A100's 312 teraFLOPS peak.
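Checking that in Python (312 TFLOPS is the A100's dense BF16/FP16 peak, and this assumes 100% utilization, which no real training run achieves):

```python
A100_PEAK_FLOPS = 312e12               # dense BF16/FP16 peak, ops/s
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~3.156e7 s

ops_per_a100_year = A100_PEAK_FLOPS * SECONDS_PER_YEAR  # ~9.85e21 ops
print(1e26 / ops_per_a100_year)  # ~10,200 A100-years at theoretical peak
```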
Does this even apply to anybody yet, or is it only relevant going forward? Did even GPT-4 need that much compute?
OpenAI's CEO said it had cost "much more than 100M" in compute to train.
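A rough sanity check on whether that clears 10^26; every number below is my assumption (the cloud rate and utilization are guesses, not anything OpenAI has disclosed):

```python
BUDGET_USD = 100e6          # "much more than 100M" -- treat as a floor
PRICE_PER_GPU_HOUR = 1.50   # assumed A100 cloud rate, USD (a guess)
A100_PEAK_FLOPS = 312e12    # dense BF16/FP16 peak, ops/s
UTILIZATION = 0.40          # assumed hardware utilization (a guess)

gpu_hours = BUDGET_USD / PRICE_PER_GPU_HOUR                   # ~6.7e7 GPU-hours
total_ops = gpu_hours * 3600 * A100_PEAK_FLOPS * UTILIZATION  # ~3.0e25 ops
print(f"{total_ops:.1e} ops vs. threshold 1e26")
```

At those guesses, a $100M run lands around 3x10^25 ops, a factor of ~3 under the threshold, which suggests the rule is aimed at the next generation of models rather than anything already trained.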