Zoom, the video conference company, just updated their Terms of Service. Section 10.4 lets them use your meeting content to train machine learning and AI models.
- If you deliver your expertise via Zoom, they could use that content to duplicate your expertise.
- This could put you in violation of the law if you’re in a privacy-regulated profession.
- Zoom says they’ll ask for consent first, but they don’t say how. It could be as deceptive as “You must click Accept on our updated terms.”
- I’m looking for privacy-protecting alternatives to recommend.
Does Machine Learning really duplicate expertise? Yes. Machine learning is the new hot topic in Silicon Valley. It is a statistical analysis method that uses large data sets to build computer programs that can predict things. For example, a medical diagnosis system would be given a huge number of medical cases, each labeled with the symptoms, the eventual diagnosis, and whether that diagnosis was correct.
With enough data, the system would be able to diagnose as well as (or better than) any of the doctors who provided the training data.
Getting enough high-quality data is the key to building a model that works.
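To make the idea concrete, here is a deliberately tiny sketch of “learning from past cases.” The symptoms, diagnoses, and similarity measure are all invented for illustration; real diagnostic models use vastly larger data sets and proper statistical methods, but the principle is the same: predictions come directly from the data people supplied.

```python
# Toy illustration: "learn" diagnoses from past labeled cases,
# then predict a new case by majority vote of the most similar ones.
# All cases and symptoms here are made up for illustration only.

from collections import Counter

# Each training case: (set of symptoms, confirmed diagnosis)
training_cases = [
    ({"fever", "cough", "fatigue"}, "flu"),
    ({"fever", "cough"}, "flu"),
    ({"sneezing", "runny nose"}, "cold"),
    ({"sneezing", "runny nose", "cough"}, "cold"),
    ({"headache", "nausea"}, "migraine"),
]

def diagnose(symptoms, k=3):
    """Predict a diagnosis from the k most similar past cases."""
    # Similarity = number of shared symptoms (a crude stand-in
    # for the distance measures real models use).
    ranked = sorted(training_cases,
                    key=lambda case: len(symptoms & case[0]),
                    reverse=True)
    votes = Counter(diag for _, diag in ranked[:k])
    return votes.most_common(1)[0][0]

print(diagnose({"fever", "cough"}))  # prints "flu"
```

Every prediction this toy makes is a direct echo of the training data — which is exactly why the people who supplied that data matter.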
Cloud services are using user data to build products that effectively “steal” those same users’ work. Adobe has released truly astonishing new capabilities in their image processing program Photoshop. The latest Photoshop can fill in blank spaces in images, as if by magic.
How did the AI get that ability? By training on images that graphic designers had stored in the Adobe Cloud over the last decade.
The Terms of Service did say, buried somewhere in several dozens or hundreds of pages of legalese, that data stored in the Adobe Cloud could be used for the creation of new products and services.
I think it’s highly unlikely that (a) users knew that clause was there, (b) users understood the implications of letting Adobe train an AI on their images (namely, that Adobe can now duplicate their work), or (c) users thought that a cloud service, which is usually a storage product, would be used to train an AI delivered in a separate product.
Is this theft? Legally, probably not. But morally and ethically, I use a different test: if the users of Adobe Cloud had been told “we will use any image you store with us to train an AI to be able to generate similar images without paying photographers or designers royalties of any sort,” I’m fairly sure most people would have opted out. They probably would have said “You mean you want to steal my images and design style? Hard pass.”
This is playing out right now in Hollywood. The actors’ and writers’ strike shows that when people understand that they’re being asked to give away their creative work for free, they refuse. The actors and writers are simply asking for protection against exactly this sort of AI duplication of their work and likeness.
Does one person’s data really matter? If Zoom is collecting data on billions of meetings, do my meetings really make that much difference in their machine learning models?
No. And yes. If (as has happened) a bank accidentally charges 100,000 accounts an incorrect $10 fee, does that matter? $10 is unlikely to bankrupt anyone. But the bank just walked away with a million dollars. I think we would all call that theft.
AI companies are doing the same thing now, only with expertise rather than money.
How this affects you.
Legal effects. If you’re a doctor, trainer, consultant, therapist, lawyer, or anyone who delivers your product or service via Zoom, this means that your private conversations may be used to train a machine learning model.
Depending on your profession, there may be legal implications for you if this happens.
Ethical effects. If Zoom then releases that machine learning model in some form (or even just uses it internally), that model might produce output that is similar to the conversations you’ve had. It may use slightly different words, or omit proper names, but it may summarize your conversations or convey the gist of them. Depending on the summary, it might give enough hints that a reader could deduce the actual details.
Competitive effects. If you make your money from giving advice, the model that Zoom builds might be used, as Adobe’s was, to produce a product that directly competes with you in the marketplace. Why hire an interior decorator when Zoom will let you pay $50 to work with an AI that’s been trained on a few hundred thousand conversations that real interior decorators had with their clients?
What to do next.
If you’re concerned about your conversations being used to put you out of business, stop using Zoom and refuse to deliver your services over anyone else’s Zoom account (they may have given consent for their account to be used for training data).
Be vigilant. These clauses are becoming more and more common in cloud services. When you use any new high-tech product, search its “Terms of Service” pages for the phrases “machine learning,” “AI,” and “artificial intelligence,” and see what rights you’re giving up to your own work product.
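The phrase-search habit above can even be automated. Here is a minimal sketch that scans a saved copy of a Terms of Service document for AI-related phrases and prints the sentences that contain them; the sample text and phrase list are assumptions for illustration, not any company’s actual terms.

```python
# Sketch: flag sentences in a Terms of Service text that mention
# AI/ML-related phrases. The phrase list and sample text are
# illustrative assumptions, not real ToS language.

import re

PHRASES = ["machine learning", "artificial intelligence", "ai model"]

def flag_ai_clauses(tos_text):
    """Return sentences that mention any AI/ML-related phrase."""
    # Naive sentence split on punctuation followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", tos_text)
    flagged = []
    for sentence in sentences:
        lowered = sentence.lower()
        if any(phrase in lowered for phrase in PHRASES):
            flagged.append(sentence.strip())
    return flagged

sample = ("We respect your privacy. Customer Content may be used to "
          "train machine learning and AI models. Fees are due monthly.")
for clause in flag_ai_clauses(sample):
    print(clause)
```

It’s a blunt instrument — a real review still means reading the clause in context — but it turns a hundred pages of legalese into a short list of sentences worth scrutinizing.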
(Cars are now a high-tech product, by the way. Pay particular attention to whether the car company can monitor and record everything you do and say inside your car. Some car companies have you grant that right as a condition of buying the car.)
Stay safe out there. Because now, it’s your very own productivity tools that are trying to take you down.