Artists use tech weapons against AI copycats
Artists use tech weapons against AI copycats / Photo: © AFP

Artists under siege by artificial intelligence (AI) that studies their work, then replicates their styles, have teamed with university researchers to stymie such copycat activity.

US illustrator Paloma McClain went into defense mode after learning that several AI models had been "trained" using her art, with no credit or compensation sent her way.

"It bothered me," McClain told AFP.

"I believe truly meaningful technological advancement is done ethically and elevates all people instead of functioning at the expense of others."

The artist turned to free software called Glaze created by researchers at the University of Chicago.

Glaze essentially outthinks AI models when it comes to how they train, tweaking pixels in ways indiscernible to human viewers but which make a digitized piece of art appear dramatically different to AI.
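In very rough terms, the "cloaking" idea resembles the toy sketch below, which nudges every pixel by an amount too small for the eye to register. Glaze itself computes its perturbations adversarially against the feature extractors AI models rely on, so the random noise, file names and strength value here are placeholders meant only to convey the concept, not the actual method.

    # Illustrative sketch only: a toy "cloak" that slightly perturbs pixels.
    # Glaze optimizes its changes against AI feature extractors; the random
    # noise used here is just to show the general idea.
    import numpy as np
    from PIL import Image

    def cloak(path_in, path_out, strength=2.0):
        img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.float32)
        rng = np.random.default_rng(0)
        # Offsets small enough to be invisible to a human viewer.
        noise = rng.uniform(-strength, strength, size=img.shape)
        Image.fromarray(np.clip(img + noise, 0, 255).astype(np.uint8)).save(path_out)

    cloak("artwork.png", "artwork_cloaked.png")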

"We're basically providing technical tools to help protect human creators against invasive and abusive AI models," said professor of computer science Ben Zhao of the Glaze team.

Created in just four months, Glaze was spun off from technology used to disrupt facial recognition systems.

"We were working at super-fast speed because we knew the problem was serious," Zhao said of rushing to defend artists from software imitators.

"A lot of people were in pain."

Generative AI giants have agreements to use data for training in some cases, but the majority of digital images, audio and text used to shape the way supersmart software thinks has been scraped from the internet without explicit consent.

Since its release in March of 2023, Glaze has been downloaded more than 1.6 million times, according to Zhao.

Zhao's team is working on a Glaze enhancement called Nightshade that notches up defenses by confusing AI, say by getting it to interpret a dog as a cat.

"I believe Nightshade will have a noticeable effect if enough artists use it and put enough poisoned images into the wild," McClain said, meaning easily available online.

"According to Nightshade's research, it wouldn't take as many poisoned images as one might think."

Zhao's team has been approached by several companies that want to use Nightshade, according to the Chicago academic.

"The goal is for people to be able to protect their content, whether it's individual artists or companies with a lot of intellectual property," said Zhao.

- Viva Voce -

Startup Spawning has developed Kudurru software that detects attempts to harvest large numbers of images from an online venue.

An artist can then block access or send images that don't match what is being requested, tainting the pool of data being used to teach AI what is what, according to Spawning cofounder Jordan Meyer.
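In broad strokes, the behavior Meyer describes could look like the hypothetical server-side sketch below, which counts requests per visitor and starts serving a decoy image once a scraping threshold is crossed. The endpoint, threshold and file names are invented for illustration; this is not Spawning's Kudurru code.

    # Hypothetical illustration of the behavior described above; not
    # Spawning's actual Kudurru software. A site counts requests per visitor
    # and, past a scraping threshold, serves a decoy image instead.
    from collections import Counter
    from flask import Flask, request, send_file

    app = Flask(__name__)
    hits = Counter()
    SCRAPE_THRESHOLD = 500   # assumed cutoff for bulk harvesting

    @app.route("/images/<name>")
    def serve_image(name):
        hits[request.remote_addr] += 1
        if hits[request.remote_addr] > SCRAPE_THRESHOLD:
            # Suspected scraper: taint its dataset with a mismatched image.
            return send_file("decoy.jpg", mimetype="image/jpeg")
        return send_file(f"gallery/{name}", mimetype="image/jpeg")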

More than a thousand websites have already been integrated into the Kudurru network.

Spawning has also launched haveibeentrained.com, a website that features an online tool for finding out whether digitized works have been fed into an AI model and allows artists to opt out of such use in the future.

As defenses ramp up for images, researchers at Washington University in Missouri have developed AntiFake software to thwart AI copying voices.

AntiFake enriches digital recordings of people speaking, adding noises inaudible to people but which make it "impossible to synthesize a human voice," said Zhiyuan Yu, the PhD student behind the project.
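Greatly simplified, that enrichment resembles the sketch below, which mixes a faint signal into a recording. AntiFake's actual perturbation is crafted adversarially so voice-cloning models cannot reproduce the speaker; the noise level and file names here are assumptions used only to illustrate the concept.

    # Illustrative sketch only: mixing a faint, near-inaudible signal into a
    # recording. AntiFake crafts its perturbation adversarially against
    # voice-cloning models; plain noise here just conveys the concept.
    import numpy as np
    import soundfile as sf

    def protect_voice(path_in, path_out, level=0.002):
        audio, sr = sf.read(path_in)                 # samples in [-1.0, 1.0]
        rng = np.random.default_rng(0)
        noise = rng.normal(0.0, level, size=audio.shape)
        sf.write(path_out, np.clip(audio + noise, -1.0, 1.0), sr)

    protect_voice("speech.wav", "speech_protected.wav")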

The program aims to go beyond just stopping unauthorized training of AI to preventing creation of "deepfakes" -- bogus soundtracks or videos of celebrities, politicians, relatives, or others showing them doing or saying something they didn't.

A popular podcast recently reached out to the AntiFake team for help stopping its productions from being hijacked, according to Zhiyuan Yu.

The freely available software has so far been used for recordings of people speaking, but could also be applied to songs, the researcher said.

"The best solution would be a world in which all data used for AI is subject to consent and payment," Meyer contended.

"We hope to push developers in this direction."

S.Jordan--TFWP