Will A Robot Take Your Writing Job?
The pandemic has accelerated automation in non-creative fields, but are creative jobs vulnerable, too?
Since the early 1800s, when Luddites started smashing English looms, automation has been putting people out of work.
In the 20th century, automation took over manufacturing, data-entry, and manual computing jobs. Now, breakthroughs in “deep learning” have dramatically increased the number of jobs that can be automated. McKinsey estimates that, by 2030, as much as 30% of work will be done by machines, displacing almost 400 million workers.
If you’re entering the job market or considering a career change, you’re probably wondering whether your job will still be around 20 years from now.
AI is now being used to make pizza, trade stocks, and operate call centers (much to the chagrin of anyone trying to reach a service provider). Administrative jobs and even some legal tasks, like document review and contract generation, are increasingly being handled by machines.
But what about more creative jobs?
For a long time, people have touted the relative safety of “creative” professions. While computers are already better than humans at things like processing data, sensory perception, recall, and predictable physical motion, humans outpace computers in soft skills like empathy, unstructured problem-solving, and unpredictable physical movement.
But is that changing? As neural networks improve, AI software is being used to disrupt professions once considered automation-proof.
There are now AI actors, poets, and visual artists. Even influencers, whose professions are based almost solely on being “personable,” are having to compete against digital competitors. Imma, a virtual influencer, has over 300,000 Instagram followers and has landed deals with brands like Celine.
Just this year, OpenAI released GPT-3, a pre-trained language model that can generate texts ranging from emails to dialogue to even memes.
Other platforms generate prose too, but their output is notoriously incoherent. GPT-3 is different.