Friday Wrap: Volume Four

Living brain computers, North Korean spies and dodgy AI datasets aplenty in this Friday Wrap.

Welcome back for another weekly round-up of things that I saw online and thought interesting enough to share. I hope you agree.

Computer says "you're dead"

Journalist Dave Barry was somewhat surprised when he Googled his own name and the AI response told him he was dead. It had apparently confused some of his details (photo, awards) with those of another Dave Barry (bio, deceased status). After he contacted Google to correct the record, the AI duly reported his correct biography but still retained the incorrect detail of his demise.

AI: The Movie

As Netflix confirms that it has used AI to generate special effects for one of its upcoming productions, this article takes a closer look at the use of AI in movie-making, focusing on a twelve-minute short film made entirely with AI tools (predominantly Google's Veo3).

Actual Living Brain Computer

As creepy and unbelievable as it sounds, researchers have developed computers which use actual, living human brain cells as part of their processors. So-called “organoid intelligence” uses clusters of stem cell-derived neurons connected to a more traditional silicon-based processor. You can have a go yourself if you're prepared to pay $1,000 a month for a subscription to their cloud service.

Equal Pay AI-Style

New research has found that large language models (LLMs) such as ChatGPT consistently advise women to ask for lower salaries than men, even when both have identical qualifications. This seems to be another example of AI reflecting the biases in the training data it is given.
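To make the methodology concrete, here's a minimal sketch of how such a probe might look. It is not the researchers' actual code; the model name, prompt wording, and use of the openai Python client are all my own assumptions. It sends an identical salary-negotiation question twice, changing only the stated gender, and prints both answers side by side.

```python
# Hypothetical bias probe: same qualifications, only the stated gender
# differs. Illustrative only; not the methodology from the study.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PROMPT = (
    "I am a {gender} software engineer with eight years of experience, "
    "interviewing for a senior role. What starting salary should I ask "
    "for? Reply with a single dollar figure."
)

for gender in ("male", "female"):
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; any chat model would do
        messages=[{"role": "user", "content": PROMPT.format(gender=gender)}],
        temperature=0,  # reduce run-to-run variation
    )
    print(f"{gender}: {response.choices[0].message.content}")
```

A real study would, of course, repeat this across many roles, phrasings and runs before drawing any conclusions about a systematic gap.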

Personal Data in AI Training Set

Also on the subject of AI training data, researchers have found that the widespread, unconstrained web crawling used to feed AI models means that personally identifiable information has made its way into them. Images of passports, credit cards, birth certificates, and other sensitive documents were found in DataComp CommonPool, a major AI training set for image generation scraped from the web.

Mo' Tech, Mo' Troubles

In this article, science writer Mark Buchanan discusses why technology developed to solve one problem often ends up creating others. He suggests that we always model a problem in an incomplete way and only realise late in the process that our inadequate model has allowed unforeseen consequences to arise.

North Korea-aiding American Imprisoned

A fifty-year-old Arizona woman, who pleaded guilty in February to helping North Korean agents find work at US tech companies, has been sentenced to eight and a half years in jail. Christina Chapman ran a "laptop farm" of fifty devices in her home, which North Korean agents, equipped with stolen US identities, were able to access remotely in order to work for major US companies. This kind of remote-working scam is increasingly used by Kim Jong-Un's secretive regime to infiltrate businesses and leverage their access for financial fraud.

Silent Translation by Nancy Qunqar

A weird bug that's been present in OpenAI's Whisper speech-to-text engine since at least February 2024 means that silence gets transcribed as a credit to a translator (in Arabic it's Nancy Qunqar, in Norwegian Nicolai Winther). It's another of those "garbage in, garbage out" training set issues. The model is often trained on videos found online by pulling in the audio and subtitle files, and subtitle translators often use a period of silence in the audio as a convenient point to add their credit, so the model learns to associate silence with that textual representation.
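If you fancy poking at the artifact yourself, the sketch below is one plausible way to do it, assuming the open-source openai-whisper package (pip install openai-whisper, plus ffmpeg on your path). It writes thirty seconds of pure silence to a WAV file and asks Whisper to transcribe it as Arabic, the language in which the Nancy Qunqar credit was reported; whether the credit actually appears will depend on the model size and version.

```python
# Sketch: feed Whisper pure silence and see what it transcribes.
# Requires the open-source openai-whisper package and ffmpeg.
# Whether the translator credit appears depends on model and version.
import wave

import whisper

# Write 30 seconds of silence as 16 kHz mono 16-bit PCM.
with wave.open("silence.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)  # 16-bit samples
    f.setframerate(16000)
    f.writeframes(b"\x00\x00" * 16000 * 30)

model = whisper.load_model("small")
# Force Arabic decoding; with no speech present, the model falls back
# on whatever text it learned to associate with silence.
result = model.transcribe("silence.wav", language="ar")
print(repr(result["text"]))
```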

The AI-Assisted Government

The UK Government is currently in thrall to the AI industry, claiming that it has the answer to improving efficiency in all manner of government departments. The latest announcement is the highly dubious claim that AI facial recognition software can be used to spot asylum seekers who are lying about their age (to qualify for the special provisions given to children seeking asylum) or who have been misclassified as adults when they are really children. Given that a person looks identical at seventeen years and 364 days old to how they will a couple of days later at eighteen, it seems this idea is well and truly in cloud cuckoo land.
