AI can better retain what it learns by mimicking human sleep

[…]

Concetto Spampinato and his colleagues at the University of Catania, Italy, were looking for ways to avoid a phenomenon known as “catastrophic forgetting”, where an AI model trained to do a new task loses the ability to carry out jobs it previously aced. For instance, a model trained to identify animals could learn to spot different fish species, but then it might inadvertently lose its proficiency at recognising birds.

They developed a new method of training AI called wake-sleep consolidated learning (WSCL), which mimics the way human brains reinforce new information. People shuffle short-term memories of experiences and lessons learned throughout the day into long-term memories while sleeping. The researchers say this method of learning can be applied to any existing AI.

Models using WSCL are trained as usual on a set of data for the “awake” phase. But they are also programmed to have periods of “sleep”, during which they work through a sample of the awake data as well as a highlight reel from previous lessons.

Take an animal identification model that has recently been trained on images of marine life: during a sleep period, it would be shown snapshots of fish, but also a smattering of birds, lions and elephants from older lessons. Spampinato says this is akin to humans mulling over new and old memories while sleeping, spotting connections and patterns and integrating them into our minds. The new data teaches the AI a fresh ability, while the remainder of the old data prevents the recently acquired skill from pushing out existing ones.
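In outline, this sleep phase resembles replay-based continual learning, where a memory buffer of earlier examples is interleaved with fresh data during training. The sketch below (in PyTorch) is only an illustration of that idea, not the published WSCL code: the buffer, the mix ratio and the function names are assumptions made for the example.

```python
# A minimal sketch of a replay-style "sleep" phase, assuming a simple memory
# buffer of past (image, label) pairs; the actual WSCL procedure is described
# in the Catania group's paper and may differ in its details.
import random
import torch
from torch import nn


def sleep_phase(model: nn.Module,
                awake_samples: list,          # (image, label) pairs from the current task
                memory_buffer: list,          # (image, label) pairs kept from older tasks
                optimizer: torch.optim.Optimizer,
                steps: int = 100,
                batch_size: int = 32,
                old_fraction: float = 0.3):   # share of each batch drawn from old tasks (illustrative)
    """Replay a mix of new-task data and a 'highlight reel' of old-task data."""
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(steps):
        n_old = int(batch_size * old_fraction)
        batch = random.sample(memory_buffer, min(n_old, len(memory_buffer))) \
              + random.sample(awake_samples, batch_size - n_old)
        images = torch.stack([x for x, _ in batch])
        labels = torch.tensor([y for _, y in batch])

        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```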

Crucially, WSCL also has a period of “dreaming”, when it consumes entirely novel data made from mashing together previous concepts. For instance, the animal model might be fed abstract images showing combinations of giraffes crossed with fish, or lions crossed with elephants. Spampinato says this phase helps to merge previous paths of digital “neurons”, freeing up space for other concepts in the future. It also primes unused neurons with patterns that will help them pick up new lessons more easily.
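The article does not spell out how these dream inputs are built; one plausible reading is a mixup-style blend of stored images from different old classes, which is what the following illustrative sketch does. The blending weight and the soft-label format are assumptions, not the team's published method.

```python
# A sketch of a "dream" batch built by blending pairs of stored images from
# different old classes (a mixup-style combination). How WSCL actually
# generates its dream inputs is assumed here and may differ from the paper.
import random
import torch


def make_dream_batch(memory_buffer: list, batch_size: int = 32, alpha: float = 0.5):
    """Blend pairs of old-task images, e.g. 'giraffe crossed with fish'."""
    dream_images, dream_targets = [], []
    for _ in range(batch_size):
        (img_a, label_a), (img_b, label_b) = random.sample(memory_buffer, 2)
        lam = torch.distributions.Beta(alpha, alpha).sample().item()
        dream_images.append(lam * img_a + (1 - lam) * img_b)   # pixel-wise blend
        dream_targets.append((label_a, label_b, lam))           # soft, mixed label
    return torch.stack(dream_images), dream_targets
```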

[…]

Spampinato tested three existing AI models using a traditional training method, followed by WSCL training. Then he and his team compared the performances using three standard benchmarks for image identification. The researchers found their newly developed technique led to a significant accuracy boost – the sleep-trained models were 2 to 12 per cent more likely to correctly identify the contents of an image. They also measured an increase in the WSCL systems’ “forward transfer”, a metric indicating how much old knowledge a model uses to learn a new task. The research indicated AI trained with the sleep method remembered old tasks better than the traditionally trained systems.
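Forward transfer is a standard continual-learning measure: roughly, how well a model performs on a task it has not yet been trained on, compared with an untrained baseline. The sketch below uses the common formulation from the continual-learning literature; whether the Catania team computed it in exactly this way is an assumption.

```python
# A sketch of "forward transfer" as commonly defined in continual-learning work:
# accuracy on a task measured *before* training on it, minus the accuracy of a
# randomly initialised model on that task, averaged over tasks.
import numpy as np


def forward_transfer(acc_matrix: np.ndarray, random_baseline: np.ndarray) -> float:
    """
    acc_matrix[i, j]   = accuracy on task j after finishing training on task i
    random_baseline[j] = accuracy of an untrained model on task j
    """
    num_tasks = acc_matrix.shape[0]
    gains = [acc_matrix[i - 1, i] - random_baseline[i] for i in range(1, num_tasks)]
    return float(np.mean(gains))
```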

[…]

Source: AI can better retain what it learns by mimicking human sleep | New Scientist
