AI as a Political Force

Palantir, co-founded by Peter Thiel (an immigrant and major Republican donor), is a software company that sucks up vast quantities of data from disparate sources, correlates them, and provides data integration and analysis software for governments, the military, and corporations. The ACLU has said Palantir's AI can be used for predictive policing, possibly to identify (and arrest?) future criminals before they commit any crimes. Big Brother would approve. Thiel is chairman of the company and Alex Karp is CEO.

In an interview with CNBC, Karp said the quiet part out loud: "This technology disrupts humanities-trained—largely Democratic—voters, and makes their economic power less. And increases the economic power of vocationally trained, working-class, often male, working-class voters." In other words, a goal of AI is to take away the jobs of Democrats and help Republicans.

Sometimes the best-laid plans of rodents and humans go awry. Here are a few notes about Karp and Thiel's plan to use AI to transfer power from Democrats to Republicans. First, there are plenty of men in desk jobs pushing paper around. It is not clear that AI-induced disruption will fall more heavily on women than on men; it might be gender neutral.

Second, even if all the editors at PBS are women and they lose their well-paying jobs and have to become nurses' aides to survive, that doesn't mean they will lose their votes. Some who were independents might come to believe that Big Tech and the Republican billionaires who run the companies are the problem and vote accordingly.

Third, if Democrats see this happening, they may double down on pumping money into sectors where women do well, such as elementary education, child care, health care, and social services. The Barnard-educated female editors laid off at PBS may come to clearly understand who is trying to help them and who is not.

Fourth, AI-driven automation may kill "manly" jobs as well as "intellectual female" jobs. In China and Japan, AI has helped to create dark factories, which have (almost) no lights on because there are no humans working there, just robots. China has hundreds of dark factories. Here is a link to a video about the dark factories. Donald Trump thinks all China can do is make cheap T-shirts. This video shows the real China, and it is very impressive.

AI that eliminates factory jobs hits the Republican base very hard. Also appearing in the world of manufacturing are 3-D printers, which can replace machinists. AI is also being used in mining and logging, industries with many men and few women. Another industry where AI will hit men harder than women is transportation, with self-driving taxis, trucks, buses, trains, trams, subways, boats, and planes eliminating the need for human drivers, most of whom are men.

Probably the most "manly" job of all—being a grunt in the Army—is starting to be replaced by drone programmers and operators, jobs women can do as well as men. Even the most highly skilled job in the military—piloting a jet fighter—can be done by AI. For example, the X-BAT and the Anduril Fury are fully autonomous fighter jets with no human pilot.

Karp was definitely getting out over his skis in thinking that the only jobs on the line are those done by highly educated women in an office using a computer. AI may kill off more "manly" jobs than "womanly" jobs, because many jobs that women are more likely to do, like teaching, child care, and nursing, don't automate well. Relatedly, the economy is increasingly focused on services rather than on products. Women tend to play a bigger role in services and men in products, and AI won't change that.

Also, if they are smart, the Democrats could make reining in AI a key part of their platform, so that when jobs are lost, voters will know who was for and who was against some bot taking over their jobs. One way to start would be to blame AI data centers for rising electricity bills, which is an easy sell. In fact, data centers are already becoming a wedge issue.

Another approach is to advocate and pass laws assigning legal liability when AI gets something wrong. For example, if a patient is "examined" by an AI doctor at a hospital, the AI doctor makes a mistake, and the patient is injured or dies, the law could make clear that malpractice laws and civil lawsuits most definitely apply to AI doctors, too. More broadly, the law could state that any time an AI bot injures anyone, the organization using the bot bears the full legal liability a human would have under the same circumstances, and that the organization can't pass the blame off to the makers of the AI software. Such liability would slow the adoption of AI bots, as managers would worry about getting sued when the bots make mistakes.

If the Democrats stake a claim on "We don't trust AI bots" and the Republicans' position is "full speed ahead on AI bots," guess who gets the blame when jobs are lost and things go wrong, as they inevitably will sometimes. Thanks to reader B.P. in SLC, UT for the tip about Karp. (V)



This item appeared on www.electoral-vote.com. Read it Monday through Friday for political and election news, Saturday for answers to readers' questions, and Sunday for letters from readers.
