Nabka News
Tech

Google claims new AI training technique is 13x faster and 10x more power efficient — DeepMind’s new JEST optimizes training data for huge gains

By i2wtc · July 7, 2024 · 3 Mins Read
Google’s AI research lab, Google DeepMind, has published new research on training AI models that it claims improves both training speed and energy efficiency by an order of magnitude, delivering 13x the performance and 10x the power efficiency of alternative methods. The new JEST training method arrives as debate heats up over the environmental impact of AI data centers.

DeepMind’s technique, called JEST (Joint Example Selection), departs from traditional AI model training techniques. Where typical pipelines select and weight individual data points for training and learning, JEST selects entire batches. It first trains a small AI model on a very high-quality curated dataset to learn what good data looks like; that small model then scores and ranks batches drawn from a larger, lower-quality dataset. The batches the small model judges best are then used to train the larger model.
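The paper's actual method jointly samples image-text batches using model-based scoring; the toy NumPy sketch below only illustrates the core "learnability" idea described above, with a greedy top-k pick standing in for the paper's joint sampling. All names (`learnability_scores`, `select_super_batch`) and the simulated losses are hypothetical, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def learnability_scores(learner_loss, reference_loss):
    # JEST-style "learnability": prioritize examples the current learner
    # still finds hard but a reference model trained on small, curated
    # data finds easy -- i.e. useful and learnable, not noise.
    return learner_loss - reference_loss

def select_super_batch(learner_loss, reference_loss, batch_size):
    # Greedy stand-in for joint example selection: from a large candidate
    # pool, keep the examples with the highest learnability scores.
    scores = learnability_scores(learner_loss, reference_loss)
    top = np.argsort(scores)[::-1][:batch_size]
    return np.sort(top)

# Toy pool of 1,000 candidates with simulated per-example losses.
learner = rng.uniform(0.5, 3.0, size=1000)    # current model's loss
reference = rng.uniform(0.1, 1.0, size=1000)  # curated-data model's loss
chosen = select_super_batch(learner, reference, batch_size=128)
```

The practical point of filtering at the batch level is that the big model never spends compute on low-value data, which is where the claimed iteration and FLOPS savings come from.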

The paper itself provides a more detailed explanation of the process used in the study and the directions for future research.

In their paper, DeepMind researchers make it clear that this “ability to direct the data selection process towards a distribution of smaller, well-curated datasets” is crucial to the success of the JEST method. Success is a good word for this research: DeepMind claims that “our approach outperforms state-of-the-art models by requiring up to 13x fewer iterations and 10x fewer computations.”

[Chart: efficiency and speed improvements over traditional AI training methods. The JEST method outperforms SigLIP (the state-of-the-art method for training models on image-caption pairs) in both speed and FLOPS efficiency, and compares favorably with many other methods. Images courtesy of Google DeepMind, Evans et al.]

Of course, the system is entirely dependent on the quality of the training data, and bootstrapping techniques will only work if you have a top-quality, human-curated data set. Nowhere is the adage “garbage in, garbage out” more applicable than with this method of “jumping ahead” in the training process. This makes the JEST method much more difficult for hobbyists and amateur AI developers to achieve than most other methods, as it will likely require expert-level research skills to curate the best initial training data.

JEST’s research comes at a time when the tech industry and governments around the world are beginning to confront artificial intelligence’s massive power demands. AI workloads consumed about 4.3 GW in 2023, roughly the annual electricity consumption of the Republic of Cyprus. And things certainly aren’t slowing down: a single ChatGPT request uses about 10 times as much power as a Google search, and Arm’s CEO predicts that AI could consume a quarter of the U.S. power grid’s output by 2030.

It remains to be seen if and how the JEST method will be adopted by big players in the AI field. With GPT-4o reportedly costing $100 million to train, and future larger models likely to reach the $1 billion mark soon, companies will be looking for ways to cut costs in this sector. Optimists believe the JEST method will be used to maintain current training productivity at significantly lower power consumption, reducing the cost of AI and helping the planet. However, it is far more likely that the industry will keep its foot on the gas, using JEST to keep power consumption at its maximum for lightning-fast training output. Which will win: cost savings or output scale?
