• Topics
  • Artificial intelligence
  • Autonomous driving
  • Network
  • Processor
  • Mobile phones
  • Exhibitions
    • CES
      • CES 2014
      • CES 2015
      • CES 2016
      • CES 2017
      • CES 2018
      • CES 2019
      • CES 2020
    • MWC
      • MWC 2014
      • MWC 2015
      • MWC 2016
      • MWC 2017
      • MWC 2018
      • MWC 2019
    • Computex
      • Computex 2014
      • Computex 2015
      • Computex 2016
      • Computex 2017
      • Computex 2018
      • Computex 2019
    • E3
      • E3 2014
      • E3 2015
      • E3 2016
      • E3 2017
    • IFA
      • IFA 2014
      • IFA 2015
      • IFA 2016
      • IFA 2017
    • TGS
      • TGS 2016
  • About us
    • About mashdigi
    • mashdigi website contact details
mashdigi-Technology, new products, interesting news, trends

Google officially releases Gemma 2, an open-source model with 27 billion parameters, to developers and researchers.
A smaller version with 2.6 billion parameters, which can run on mobile phones, will also be made available.

Author: Mash Yang
2024-06-28
in Market dynamics, Life, Network, Software, Topics

Google announced the launch of the new open-source model Gemma 2. It is optimized for TPU and GPU acceleration, delivers up to twice the running performance, and comes in a version with up to 27 billion parameters. A smaller 9-billion-parameter version is also provided, and an even smaller 2.6-billion-parameter version, which can run on mobile phones, will follow in the future.


As described earlier, Gemma 2 can be obtained through Kaggle, the data-modeling and data-analysis competition platform, or used free of charge through Colab (Colaboratory), Google's web-based programming service. Academic researchers can also apply for access through research programs.
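For developers working outside Kaggle and Colab, the checkpoints are also commonly pulled through Hugging Face. The sketch below is illustrative only: the hub ids (`google/gemma-2-27b-it` and friends) and the `transformers` pipeline call are our assumptions, not details from the article, and downloading the weights requires accepting Google's license and authenticating.

```python
def gemma2_model_id(params_b: int, instruction_tuned: bool = True) -> str:
    """Map a Gemma 2 parameter scale (in billions) to its assumed hub id."""
    if params_b not in (2, 9, 27):
        raise ValueError("Gemma 2 comes in roughly 2B, 9B, and 27B scales")
    suffix = "-it" if instruction_tuned else ""
    return f"google/gemma-2-{params_b}b{suffix}"


def generate(prompt: str, params_b: int = 9) -> str:
    """Download the checkpoint and run one generation.

    Requires a GPU, an accepted license, and a Hugging Face token;
    the import is deferred so the helper above stays lightweight.
    """
    from transformers import pipeline

    pipe = pipeline("text-generation", model=gemma2_model_id(params_b))
    return pipe(prompt, max_new_tokens=64)[0]["generated_text"]
```

For example, `gemma2_model_id(27)` yields `"google/gemma-2-27b-it"`, the id assumed here for the instruction-tuned 27-billion-parameter checkpoint.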

In benchmark testing, the 27-billion-parameter version of Gemma 2 in fine-tuned form surpassed the 70-billion-parameter Llama 3, as well as the 340-billion-parameter Nemotron 4 and models such as Claude 3 Sonnet, Command R+, and Qwen 72B. The 9-billion-parameter version was the best-performing model under 15 billion parameters.

▲Gemma 2 performance compared to other models

According to the documentation, the 9-billion-parameter version of Gemma 2 was trained on a cluster of 4,096 TPU v4 chips, while the 27-billion-parameter version was trained on a cluster of 6,144 TPU v5p chips. Gemma 2's overall architecture has been redesigned on a computational foundation similar to Gemma 1.1's; with strengthened learning supervision and model merging, it delivers significant improvements over Gemma 1.1 in programming, mathematics, reasoning, and safety.

In addition, the 27-billion-parameter version of Gemma 2 can perform full-precision inference efficiently on a single Google Cloud TPU host, an NVIDIA A100 80GB Tensor Core GPU, or an H100 Tensor Core GPU, maintaining high-performance computing while reducing operating costs and allowing enterprises and developers to run and deploy artificial intelligence services more economically.
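A back-of-envelope check (ours, not from the article) shows why a 27-billion-parameter model fits on a single 80 GB accelerator: at 2 bytes per parameter (bfloat16, which is typically what "full precision" means in these deployment claims), the weights alone need about 54 GB, leaving headroom for the KV cache and activations; true float32 at 4 bytes per parameter would not fit.

```python
def weight_memory_gb(params_billions: float, bytes_per_param: int) -> float:
    """Memory needed for the weights alone, in GB (1 GB = 1e9 bytes)."""
    return params_billions * 1e9 * bytes_per_param / 1e9


bf16 = weight_memory_gb(27, 2)  # bfloat16: 2 bytes per parameter -> 54.0 GB
fp32 = weight_memory_gb(27, 4)  # float32:  4 bytes per parameter -> 108.0 GB
print(bf16, fp32)
```

By the same arithmetic, the 9-billion-parameter version needs only about 18 GB in bfloat16, which is why it fits on much smaller accelerators.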

Google also emphasized that Gemma 2 was built responsibly, explaining the application of its safety features and noting that pre-training data was filtered in accordance with internal safety processes to mitigate potential risks such as bias.

▲Gemma 2's safety performance
Tags: AI, Gemma, Gemma 2, Google, Google Cloud, TPU, artificial intelligence
Mash Yang


Founder and editor of mashdigi.com, and student of technology journalism.



Copyright © 2017 mashdigi.com

  • About mashdigi.com
  • Place ads
  • Contact mashdigi.com
