Faking and manipulating photos is nothing new, but the rapid emergence of artificial intelligence has set off alarms that the technology used to trick people is advancing far faster than the technology that can identify the tricks, the New York Times reports.
The recent AI image of Pope Francis in a puffer jacket fooled many people. Elsewhere online, a photographer who posted images on Instagram created with AI program Midjourney was accused of deception.
Tech companies, researchers, photo agencies and news organizations are scrambling to establish standards for content provenance and ownership. Ironically, Scientific American reports, humans’ best defense might be yet another AI system: one trained to detect artificial images.
Most jobs will be affected by generative AI, according to a study by researchers at the University of Pennsylvania and OpenAI, the company that makes the popular AI tool ChatGPT. Among the jobs likely to be most exposed are accountants, PR specialists, writers, blockchain engineers and mathematicians.
Jack Dorsey, the former CEO of Twitter and head of payments company Block, has been spending his time lately on another platform: NOSTR, an open-source social media protocol launched in 2020 that has become popular with Bitcoin enthusiasts, Politico's Digital Future Daily reports. NOSTR is a protocol, like email, that anyone can build software on top of. A few months ago, Dorsey donated 14 Bitcoin, then worth roughly a quarter million dollars, to support the protocol's development.
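To make the "protocol, like email" comparison concrete: NOSTR clients and relays exchange signed JSON events, and the protocol's base specification (NIP-01) defines an event's id as the SHA-256 hash of a canonical serialization of its fields. The sketch below computes such an id in Python; the pubkey value is a placeholder, not a real key, and this is a minimal illustration rather than a full client implementation.

```python
import hashlib
import json

def nostr_event_id(pubkey: str, created_at: int, kind: int,
                   tags: list, content: str) -> str:
    # NIP-01: the event id is the SHA-256 of the canonical JSON array
    # [0, <pubkey hex>, <created_at>, <kind>, <tags>, <content>],
    # serialized with no extra whitespace.
    serialized = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# Example: a kind-1 text note with a placeholder 32-byte hex pubkey.
event_id = nostr_event_id("ab" * 32, 1700000000, 1, [], "hello, nostr")
print(event_id)  # 64-character hex digest
```

Because the id is just a hash of the event's contents, any independently written client or relay can verify it, which is what lets anyone build interoperable software on top of the protocol.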
Andy Hunter’s e-commerce platform uniting independent bookstores, Bookshop.org, was a pandemic hit. Now he wants to show business owners how to scale up without killing the competition, Wired reports.
“I think this is extraordinary but I don’t know if it’s beneficial.”
—Warren Buffett, on generative artificial intelligence like ChatGPT