Meta AI and Samsung researchers introduce two new methods, Prodigy and Resetting, for learning-rate adaptation that improve the adaptation rate of the state-of-the-art D-Adaptation method

Modern machine learning relies heavily on optimization to provide effective answers to complex problems in areas as diverse as computer vision, natural language processing, and reinforcement learning. How quickly these problems converge, and how good the resulting solutions are, depends largely on the learning rate chosen. Hand-tuned optimizers can work well, but they typically demand expert skill and labor-intensive experimentation, and tuning becomes even harder in applications with many agents, each running its own optimizer. In recent years, learning-rate-free adaptive methods such as D-Adaptation have therefore gained popularity for optimization without any learning-rate tuning.

A research team from Samsung AI Center and Meta AI introduces two modifications to the D-Adaptation method, called Prodigy and Resetting, that improve its worst-case non-asymptotic convergence rate, leading to faster convergence and better-quality solutions.

Both modifications target the worst-case non-asymptotic convergence rate of D-Adaptation-style methods, improving convergence speed and solution quality through changes to how the learning rate adapts. To put the proposed changes in context, the authors establish a lower bound for any approach that adapts to the distance-to-solution constant D, and they show that, among methods whose iterates grow at most exponentially, the improved approaches are worst-case optimal up to constant factors. Extensive testing then demonstrates that the augmented D-Adaptation methods adjust the learning rate quickly, yielding faster convergence and better optimization results.
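To make the distance-estimation idea concrete, here is the standard convexity argument that lower bounds of this kind rest on, sketched in generic notation rather than the paper's exact statement. For a convex objective with minimizer x_* and (sub)gradients g_i observed at iterates x_i, any nonnegative weights \lambda_i satisfy

    \sum_{i=0}^{k} \lambda_i \langle g_i, x_0 - x_i \rangle
      \;\le\; \Big\langle \sum_{i=0}^{k} \lambda_i g_i,\; x_0 - x_* \Big\rangle
      \;\le\; D \, \Big\| \sum_{i=0}^{k} \lambda_i g_i \Big\|

because convexity gives \langle g_i, x_i - x_* \rangle \ge 0 and the Cauchy-Schwarz inequality bounds the remaining inner product. Dividing the leftmost sum by the norm on the right therefore produces a running lower estimate of D = \|x_0 - x_*\| that can stand in for a hand-tuned learning rate, and the choice of the weights \lambda_i is exactly where Prodigy and Resetting depart from plain D-Adaptation.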

The team's key idea is to modify D-Adaptation's error term while using Adagrad-like step sizes. This lets the method confidently take larger steps while keeping the main error term intact, so the modified method converges more quickly. The algorithm slows down when the denominator in the step size grows too large, so the researchers additionally place weights next to the gradients to counteract this slowdown.
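As a rough illustration of how these pieces fit together, below is a minimal Python sketch of a Prodigy-style SGD loop that combines the lower-bound estimate above with Adagrad-like steps and d-weighted gradients. All names and constants here are illustrative choices for this sketch, not the authors' reference implementation:

    import numpy as np

    def prodigy_style_sgd(grad, x0, steps=1000, d0=1e-6, lr=1.0, eps=1e-12):
        # grad(x) returns the (sub)gradient of the objective at x.
        x0 = np.asarray(x0, dtype=float)
        x = x0.copy()
        d = d0                    # running lower estimate of D = ||x0 - x*||
        r = 0.0                   # accumulates sum_i eta_i * <g_i, x0 - x_i>
        s = np.zeros_like(x)      # accumulates sum_i eta_i * g_i
        g2 = 0.0                  # Adagrad-style accumulator: sum_i d_i^2 * ||g_i||^2
        for _ in range(steps):
            g = grad(x)
            g2 += d * d * g.dot(g)
            eta = lr * d * d / (np.sqrt(g2) + eps)     # Adagrad-like step, d next to g
            r += eta * g.dot(x0 - x)
            s += eta * g
            d = max(d, r / (np.linalg.norm(s) + eps))  # estimate of D can only grow
            x = x - eta * g
        return x

    # Example: least squares, f(x) = 0.5 * ||Ax - b||^2, no learning-rate tuning needed.
    rng = np.random.default_rng(0)
    A, b = rng.normal(size=(50, 10)), rng.normal(size=50)
    x_hat = prodigy_style_sgd(lambda x: A.T @ (A @ x - b), np.zeros(10), steps=2000)

Because the estimate d can only increase, the step sizes grow until they reach the scale of the true distance to the solution, which is what lets such a method start from a tiny, safe initial value yet still approach the speed of a well-tuned baseline.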

In their empirical investigation, the researchers applied the proposed techniques to convex logistic regression and deep learning problems. Across multiple experiments, Prodigy adapted faster than any previously known approach, while D-Adaptation with resetting achieved the same theoretical rate as Prodigy with much simpler theory than either Prodigy or the original D-Adaptation. Moreover, the proposed methods often outperform the D-Adaptation algorithm and can reach test accuracy on par with hand-tuned Adam.
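For readers who want to try the method, the snippet below shows Prodigy used as a drop-in replacement for Adam in PyTorch. It assumes the authors' publicly released prodigyopt package (pip install prodigyopt); the arguments shown follow that package's recommended defaults rather than anything stated in this article, so consult its documentation before relying on them:

    import torch
    from prodigyopt import Prodigy  # assumed: the paper's companion PyTorch package

    model = torch.nn.Linear(784, 10)
    # lr stays at 1.0: the effective step size is set by the learned D estimate,
    # so no manual learning-rate sweep is required.
    optimizer = Prodigy(model.parameters(), lr=1.0, weight_decay=0.01)

    # Dummy batches stand in for a real data loader.
    batches = [(torch.randn(32, 784), torch.randint(0, 10, (32,))) for _ in range(10)]
    for inputs, labels in batches:
        optimizer.zero_grad()
        loss = torch.nn.functional.cross_entropy(model(inputs), labels)
        loss.backward()
        optimizer.step()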

In summary, two newly proposed methods surpass the state-of-the-art D-Adaptation approach to learning-rate adaptation. Extensive experimental evidence shows that Prodigy, a weighted variant of D-Adaptation, is more adaptive than existing approaches, while the second method, D-Adaptation with resetting, matches Prodigy's theoretical rate with a much less complex theory.


Dhanshree Shenwai is a software engineer with strong experience in FinTech companies spanning finance, cards & payments, and banking, and a keen interest in applications of AI. She is enthusiastic about exploring new technologies and advancements in today's evolving world, making everyone's life easier.
