Computing vaccine

July 2019
Image: Mike MacKenzie

Australian researchers have developed a way to ‘vaccinate’ computer algorithms from adversarial attacks.

Algorithms perform a core function in the rapidly growing fields of artificial intelligence (AI) and machine learning: they analyse data and ‘learn’ from it to perform a given task without needing explicit instructions.

They can be designed to, for example, filter out spam emails, or make powerful predictions such as suggesting a movie based on what you have previously watched.

But while this is extremely useful, algorithms are also vulnerable to malicious data inputs crafted to make the machine learning models built on them malfunction.

For example, a program may misclassify an image when it is masked by an added layer of noise.

According to Dr Richard Nock, who leads the machine learning group at CSIRO’s Data61, adversarial attacks can also trick a machine learning model into incorrectly labelling a traffic stop sign as a speed sign, with potentially disastrous effects.
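The stop-sign scenario can be sketched with a toy example. The classifier, input, and attack below are illustrative stand-ins (the article does not describe the actual models involved); the perturbation follows the well-known ‘fast gradient sign’ style of attack, which for a linear model simply steps each pixel against the weight vector:

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy linear "image classifier": a positive score means the model
# sees a stop sign. (Purely illustrative; not Data61's actual model.)
w = rng.normal(size=64)

def predict(x):
    return "stop" if w @ x > 0 else "speed"

# A clean input the model labels confidently as a stop sign.
x = 0.1 * np.sign(w)

# Fast-gradient-sign-style attack: nudge every pixel a small step
# against the gradient of the score. For a linear model that gradient
# is just w, so the adversarial step is -eps * sign(w).
eps = 0.2
x_adv = x - eps * np.sign(w)

print(predict(x))      # "stop"
print(predict(x_adv))  # "speed" - a tiny, uniform distortion flips the label
```

The point of the sketch is that the distortion is small per pixel (0.2 here) yet, because it is aligned with the model's weights, it is enough to flip the prediction.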

The group has led the development of a technique that effectively prevents such attacks through a process that is akin to vaccination.

This is done by exposing the algorithm to weak versions of an adversary, such as small modifications or distortions applied to a collection of images. Trained on these ‘small doses’ of distortion, the resulting machine learning model is more robust and effectively ‘immune’ to potential adversarial attacks.
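The article does not detail Data61's exact technique, but the general ‘vaccination’ idea, training on weakly distorted copies of the data rather than on clean data alone, can be sketched with a toy logistic regression (the data, distortion budget, and training loop below are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data standing in for images: two well-separated clusters, labels 0/1.
n, d = 200, 2
X = np.vstack([rng.normal(-1.0, 0.2, (n, d)), rng.normal(1.0, 0.2, (n, d))])
y = np.concatenate([np.zeros(n), np.ones(n)])

eps = 0.4  # strength of the 'weak adversary' - the small dose

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def perturb(X, y, w, eps):
    """Small distortion: push each example a step in the direction that
    increases the logistic loss under the current model (gradient-sign style)."""
    grad_x = np.outer(sigmoid(X @ w) - y, w)   # d(loss)/d(input), per example
    return X + eps * np.sign(grad_x)

# 'Vaccination' loop: at every step, train on freshly distorted copies
# of the data instead of the clean originals.
w = np.zeros(d)
for _ in range(500):
    X_adv = perturb(X, y, w, eps)
    w -= 0.1 * X_adv.T @ (sigmoid(X_adv @ w) - y) / len(y)

# The vaccinated model still classifies distorted inputs correctly.
acc_clean = np.mean((X @ w > 0) == y)
acc_attacked = np.mean((perturb(X, y, w, eps) @ w > 0) == y)
print(acc_clean, acc_attacked)
```

Having seen the distortions during training, the model keeps its accuracy even when the same budget of distortion is applied at test time, which is the sense in which it is ‘immune’.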

Artificial intelligence is one of the main target areas of CSIRO’s research investment. Recently, the organisation invested $19 million into an Artificial Intelligence and Machine Learning Future Science Platform that will drive AI-based solutions in a range of areas of national importance, such as food security and quality, health and wellbeing, and sustainable energy and resources.

The CSIRO also played a leading role in developing an AI ethics framework for Australia, released by the Australian Government for public consultation in April 2019.

More information