The Fort Worth Press - Biden robocall: Audio deepfake fuels election disinformation fears


Biden robocall: Audio deepfake fuels election disinformation fears

The 2024 White House race faces the prospect of a firehose of AI-enabled disinformation, with a robocall impersonating US President Joe Biden already stoking particular alarm about audio deepfakes.

"What a bunch of malarkey," said the phone message, digitally spoofing Biden's voice and echoing one of his signature phrases.

The robocall urged New Hampshire residents not to cast ballots in the Democratic primary last month, prompting state authorities to launch a probe into possible voter suppression.

It also triggered demands from campaigners for stricter guardrails around generative artificial intelligence tools or an outright ban on robocalls.

Disinformation researchers fear rampant misuse of AI-powered applications in a pivotal election year, thanks to proliferating voice cloning tools that are cheap, easy to use, and hard to trace.

"This is certainly the tip of the iceberg," Vijay Balasubramaniyan, chief executive and co-founder of cybersecurity firm Pindrop, told AFP.

"We can expect to see many more deepfakes throughout this election cycle."

A detailed analysis published by Pindrop said a text-to-speech system developed by the AI voice cloning startup ElevenLabs was used to create the Biden robocall.

The scandal comes as campaigners on both sides of the US political aisle harness advanced AI tools for effective campaign messaging and as tech investors pump millions of dollars into voice cloning startups.

Balasubramaniyan refused to say whether Pindrop had shared its findings with ElevenLabs, which last month announced a financing round from investors that, according to Bloomberg News, gave the firm a valuation of $1.1 billion.

ElevenLabs did not respond to repeated AFP requests for comment. Its website leads users to a free text-to-speech generator to "create natural AI voices instantly in any language."

Under its safety guidelines, the firm said users were allowed to generate voice clones of political figures such as Donald Trump without their permission if they "express humor or mockery" in a way that makes it "clear to the listener that what they are hearing is a parody, and not authentic content."

- 'Electoral chaos' -

US regulators have been considering making AI-generated robocalls illegal, with the fake Biden call giving the effort new impetus.

"The political deepfake moment is here," said Robert Weissman, president of the advocacy group Public Citizen.

"Policymakers must rush to put in place protections or we're facing electoral chaos. The New Hampshire deepfake is a reminder of the many ways that deepfakes can sow confusion."

Researchers worry about the impact of AI tools that create video and text so convincingly real that voters could struggle to distinguish truth from fiction, undermining trust in the electoral process.

But audio deepfakes used to impersonate or smear celebrities and politicians around the world have sparked the most concern.

"Of all the surfaces -- video, image, audio -- that AI can be used for voter suppression, audio is the biggest vulnerability," Tim Harper, a senior policy analyst at the Center for Democracy & Technology, told AFP.

"It is easy to clone a voice using AI, and it is difficult to identify."

- 'Election integrity' -

The ease of creating and disseminating fake audio content complicates an already hyperpolarized political landscape, undermining confidence in the media and enabling anyone to claim that fact-based "evidence has been fabricated," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

Such concerns are rife as the proliferation of AI audio tools outpaces detection software.

China's ByteDance, owner of the wildly popular platform TikTok, recently unveiled StreamVoice, an AI tool for real-time conversion of a user's voice to any desired alternative.

"Even though the attackers used ElevenLabs this time, it is likely to be a different generative AI system in future attacks," Balasubramaniyan said.

"It is imperative that there are enough safeguards available in these tools."

Balasubramaniyan and other researchers recommended building audio watermarks or digital signatures into these tools as possible safeguards, as well as regulation making them available only to verified users.

"Even with those actions, detecting when these tools are used to generate harmful content that violates your terms of service is really hard and really expensive," Harper said.

"(It) requires investment in trust and safety and a commitment to building with election integrity centred as a risk."

J.P.Cortez--TFWP