Facebook has trained an AI to treat irrelevant data like spoiled milk


Computers are just too good at remembering all the stuff we teach them. Normally, that’s fine; you wouldn’t want the systems that maintain your medical or financial records to start randomly dropping 1s and 0s (OK, well, maybe the one that tracks your credit card debt, but other than that). However, these systems generally do not discriminate between information sources, meaning every bit of data is processed with equal vigor. But as the amount of available information increases, AI systems must expend more and more of their finite computing resources to handle it. Facebook researchers hope to help future AIs pay better attention by teaching them to forget irrelevant data.

It’s called Expire-Span, and it’s designed to help neural networks more efficiently sort and store the information most pertinent to their assigned tasks. Expire-Span works by first predicting which information will be most useful to the network in a given context and then assigning an expiration date to that piece of data. The more important Expire-Span judges a piece of information to be, the further out it sets that expiration date, Angela Fan and Sainbayar Sukhbaatar, research scientists at FAIR, explained in a blog post. Neural networks can thus retain pertinent information longer while continually freeing memory space by “forgetting” irrelevant data points. Each time a new piece of data is added, the system not only evaluates its relative importance but also reevaluates the importance of the existing data points related to it. This helps the AI use its available memory more effectively, which in turn improves scalability.
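To make the idea concrete, here’s a toy Python sketch of an expiring memory buffer in the spirit of what’s described above. This is not FAIR’s implementation: the `predict_span` heuristic stands in for the learned model that scores each item’s importance, and all names here are illustrative.

```python
from dataclasses import dataclass


@dataclass
class Memory:
    data: object
    expires_at: int  # timestep after which this item is forgotten


class ExpiringMemory:
    """Toy buffer: 'important' items get longer expiration spans."""

    def __init__(self) -> None:
        self.t = 0                       # current timestep
        self.memories: list[Memory] = []

    def predict_span(self, data) -> int:
        # Stand-in for the learned importance model; here, longer
        # strings are treated as "more important" and live longer.
        return 1 + len(str(data))

    def add(self, data) -> None:
        self.t += 1
        self.memories.append(Memory(data, self.t + self.predict_span(data)))
        # Drop anything whose expiration date has already passed.
        self.memories = [m for m in self.memories if m.expires_at > self.t]

    def recall(self) -> list:
        return [m.data for m in self.memories]


mem = ExpiringMemory()
for token in ["the", "a", "Expire-Span", "of"]:
    mem.add(token)
print(mem.recall())  # short, low-"importance" tokens expire first
```

In this sketch the eviction on every `add` call plays the role of the continual reevaluation described above: each new timestep re-checks every stored memory against its expiration date.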

The act of forgetting, for AIs at least, is a challenge because it is inherently a binary operation: like the 1s and 0s that make up the AI’s code, the system can either remember a piece of information or not. Optimizing for a binary system like that is surprisingly difficult. Previous attempts to get around that difficulty involved compressing less useful data so that it would take up less space in memory, but those efforts came up short because the compression process results in “blurry versions” of the information, according to Fan and Sukhbaatar.

“Expire-Span calculates the information’s expiration value for each hidden state, each time a new piece of information is presented, and determines how long that information is preserved as a memory,” they explained. “This gradual decay of some information is key to keeping important information without blurring it. And the learnable mechanism allows the model to adjust the span size as needed. Expire-Span calculates a prediction based on context learned from data and influenced by its surrounding memories.”
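The per-state expiration the researchers describe can be pictured as a soft mask rather than a hard cutoff. Below is a hedged PyTorch sketch under the assumption of a linear decay ramp; the `ramp` length and the exact masking rule are illustrative, not FAIR’s released code. Each memory’s contribution fades from 1 to 0 once its learned span elapses, which keeps “forgetting” differentiable so the spans themselves can be trained end to end.

```python
import torch


def expire_mask(spans: torch.Tensor, ages: torch.Tensor,
                ramp: float = 16.0) -> torch.Tensor:
    """spans: learned expiration span e_i for each memory (shape [n])
    ages:  how long ago each memory was written, t - i (shape [n])
    Returns a mask in [0, 1]: 1 while age < span, then a linear fade
    to 0 over `ramp` steps instead of a hard binary drop."""
    remaining = spans - ages  # r_i = e_i - (t - i)
    return torch.clamp(1.0 + remaining / ramp, min=0.0, max=1.0)


spans = torch.tensor([5.0, 50.0, 200.0])  # per-memory learned spans
ages = torch.tensor([60.0, 60.0, 60.0])   # all written 60 steps ago
print(expire_mask(spans, ages))           # -> ~[0.000, 0.375, 1.000]
```

The linear ramp is the key design choice: a hard 0/1 cutoff would have no useful gradient, whereas this soft mask lets the network learn how long each hidden state deserves to live.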

The work is still in the early stages of research. “As a next step in our research toward more humanlike AI systems, we’re studying how to incorporate different types of memories into neural networks,” the research team wrote. Ultimately, the team hopes to develop an even closer approximation of human memory, one capable of learning new information far faster than current technology allows.

