These Democrats Think the Party Needs AI to Win Elections


The 2024 election cycle saw artificial intelligence deployed by political campaigns for the first time. While candidates mostly avoided serious misdeeds, the technology was used with little guidance or limitation. Now, the National Democratic Training Committee (NDTC) has put together the first official playbook, which makes the case that Democratic campaigns can use AI responsibly ahead of the midterms.

In a new online training, the committee lays out a plan for Democratic candidates to use AI to create social content, write messages to voters, and research their districts and opponents. Since the NDTC's founding in 2016, the organization says, it has trained more than 120,000 Democrats seeking political office. The group offers virtual lessons and in-person bootcamps that train would-be Democratic politicians on everything from voter registration and fundraising to data management and field organizing. With its AI course, the group is mostly targeting smaller campaigns with fewer resources, looking to empower what might be five-person teams to operate with the “performance of a 15-person team.”

“AI and responsible AI adoption is a competitive necessity. It’s not a luxury,” says Donald Riddle, a senior instructional designer at the NDTC. “It’s something we need our learners to understand and feel comfortable carrying out so that they have that competitive edge and can push progressive change and push that needle while using these tools efficiently and responsibly.”

The three-part training includes an explanation of how AI works, but the meat of the course revolves around possible AI use cases for campaigns. Specifically, it encourages candidates to use AI to draft text for various platforms and purposes, including social media posts, emails, speeches, phone bank scripts, and internal training materials, with humans reviewing the drafts before they are published.

The training also points out that Democrats do not have to use AI, and it discourages candidates from using AI to deepfake their opponents, replace real people, or create images and videos that could “deceive voters by misrepresenting events, individuals, or reality.”

“This undermines democratic discourse and voter trust,” the training reads.

It also advises candidates against replacing human artists and graphic designers with AI, in order to “maintain creative integrity” and support working creators.

The final section of the course also encourages candidates to disclose AI use when content features AI-generated voices, is presented as “deeply personal,” or is used to develop complex policy positions. “When AI significantly contributes to policy development, transparency builds trust,” it reads.

These disclosures are the most important part of the training, according to Hany Farid, a generative AI expert and UC Berkeley professor of electrical engineering.

“You need to have transparency when something is not real or when something was wholly generated,” Farid says. “But the reason for this is not only so we can identify what isn’t real, but also so we can trust what is real.”

When using AI for video, the NDTC suggests that campaigns use tools such as Descript or Opus Clip to transcribe scripts and quickly edit content for social media, stripping video clips of long pauses and awkward moments.


