
Google AI: From Gemma to FunctionGemma. How Google AI Created a Compact Function-Calling Specialist for Edge Tasks

Tech · By Gavin Wallace · 26/12/2025 · 6 Mins Read
Google’s FunctionGemma is a specialized variant of the Gemma 3 model, trained for function calling. It is an edge-focused agent model that maps natural-language requests to structured API actions.

FunctionGemma is a text-only transformer with 270M parameters, built on Gemma 3 270M. The model is released as an open model under Gemma’s license and shares Gemma 3’s architecture, but its chat format and training objectives target function calling rather than free-form dialogue.

The model is designed to be fine-tuned for specific function-calling tasks; it is not meant to serve as a general-purpose chat assistant. Its primary job is translating user instructions and tool definitions into structured function calls and, optionally, summarizing tool responses back to the user.
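The input/output contract can be sketched as plain data: a tool declaration plus a user utterance go in, a structured call referencing a declared tool comes out. The schema shape and field values below are illustrative, not the exact FunctionGemma wire format:

```python
import json

# Hypothetical tool definition in the OpenAPI-style shape commonly used for
# function calling; names and fields here are illustrative only.
tool = {
    "name": "create_calendar_event",
    "description": "Create a calendar event on the device.",
    "parameters": {
        "type": "object",
        "properties": {
            "title": {"type": "string"},
            "date": {"type": "string", "description": "ISO 8601 date"},
        },
        "required": ["title", "date"],
    },
}

user_message = "Create a calendar event for lunch tomorrow"

# A function-calling model is expected to emit a structured call like this
# instead of a free-form chat reply (argument values are invented):
expected_call = {
    "name": "create_calendar_event",
    "arguments": {"title": "Lunch", "date": "2025-12-27"},
}

# The call must reference a declared tool and satisfy its required fields.
declared = {tool["name"]}
missing = [
    k for k in tool["parameters"]["required"]
    if k not in expected_call["arguments"]
]
print(json.dumps(expected_call), "| declared:", expected_call["name"] in declared, "| missing:", missing)
```

Validating emitted calls against the declared schema like this is a common guard on-device, since a 270M model can occasionally produce malformed arguments.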

FunctionGemma behaves like an ordinary causal language model: its inputs and outputs are both text sequences, with a 32K-token context window shared between input and output for each request.

Architecture and Training Data

The model is based on the Gemma 3 architecture at the same 270M parameter scale. The training and serving stack builds on the same infrastructure and research as Gemini, including JAX and ML Pathways running on large TPU clusters.

FunctionGemma inherits Gemma’s 256K-token vocabulary, which handles JSON text and nested structures efficiently. Token-efficient function schemas and tool responses, combined with a reduced sequence length, suit edge deployments with limited memory.

The model is trained on 6T tokens with an August 2024 knowledge cutoff. The dataset falls into two categories:

  • Public tool and API definitions
  • Tool-use interactions, including prompts, function calls, and responses, as well as natural-language messages that summarize or clarify outputs

This signal teaches both syntax (which function to call and how to format its arguments) and intent (when and why to ask for more information or invoke a particular function).

Control tokens and conversation formats

FunctionGemma uses a strict chat template. The conversation is expected to follow a format that separates role text from tool regions: each message is wrapped in turn delimiters, and the roles are typically developer, user, and model.

Within those turns, FunctionGemma uses pairs of control tokens to mark off distinct regions:

  • One token pair delimiting the tool definitions
  • One token pair delimiting the model’s function call
  • One token pair delimiting the serialized tool outputs

This lets the model distinguish natural-language text from execution results. Hugging Face’s apply_chat_template, together with the official Gemma templates and API, generates this structure automatically from a message list and a tool list.
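The paired-token structure can be illustrated with a minimal prompt builder. The `<tool_defs>`-style delimiters below are placeholders, not FunctionGemma’s real control tokens (those come from the official Gemma template via the tokenizer); only the turn markers and role names follow the article:

```python
import json

# Placeholder delimiter pairs: the real control tokens are defined by the
# official Gemma chat template; these stand-ins only show the structure.
TOOL_DEFS = ("<tool_defs>", "</tool_defs>")
TOOL_CALL = ("<tool_call>", "</tool_call>")
TOOL_OUT = ("<tool_out>", "</tool_out>")

def wrap(pair, payload):
    """Serialize a payload between an opening/closing control-token pair."""
    open_tok, close_tok = pair
    return f"{open_tok}{json.dumps(payload)}{close_tok}"

def build_turn(role, text="", tools=None, call=None, result=None):
    """Assemble one turn: role header, then text and any tool regions."""
    parts = [f"<start_of_turn>{role}\n"]
    if text:
        parts.append(text)
    if tools is not None:
        parts.append(wrap(TOOL_DEFS, tools))
    if call is not None:
        parts.append(wrap(TOOL_CALL, call))
    if result is not None:
        parts.append(wrap(TOOL_OUT, result))
    parts.append("<end_of_turn>\n")
    return "".join(parts)

# Developer turn declares the tools; user turn carries the request.
prompt = (
    build_turn("developer", tools=[{"name": "set_flashlight"}])
    + build_turn("user", text="Turn on the flashlight")
)
print(prompt)
```

In practice you would never hand-assemble this string; apply_chat_template exists precisely so the template stays byte-exact with what the model saw in training.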

Fine-Tuning Performance on Mobile Actions

FunctionGemma ships pre-trained for generic tool use. The official Mobile Actions guide and model card both stress that a model this small only reaches production-level reliability after fine-tuning.

Mobile Actions uses a dataset that exposes a small set of Android tools, such as creating a contact, setting a calendar event, toggling the flashlight, and displaying maps. FunctionGemma learns to map utterances such as ‘Create a calendar event for lunch tomorrow’ or ‘Turn on the flashlight’ to those tools with structured arguments.

The base FunctionGemma model reaches 58 percent accuracy on the Mobile Actions test set. After fine-tuning with the recipe from the public cookbook, accuracy rises to 85 percent.
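Scoring a function-calling benchmark typically reduces to exact matching: a prediction counts only if both the function name and the arguments equal the reference. This is a generic sketch of that kind of metric, not the published Mobile Actions harness, and the example data is invented:

```python
# Exact-match scoring sketch: a predicted call is correct only if the
# function name AND the full argument dict match the gold reference.
def call_matches(pred, gold):
    return pred["name"] == gold["name"] and pred["arguments"] == gold["arguments"]

def accuracy(preds, golds):
    correct = sum(call_matches(p, g) for p, g in zip(preds, golds))
    return correct / len(golds)

golds = [
    {"name": "set_flashlight", "arguments": {"on": True}},
    {"name": "create_contact", "arguments": {"name": "Ada"}},
]
preds = [
    {"name": "set_flashlight", "arguments": {"on": True}},
    {"name": "create_contact", "arguments": {"name": "Bob"}},  # wrong argument
]
print(accuracy(preds, golds))  # → 0.5
```

Exact matching is unforgiving, which is partly why domain fine-tuning moves the number so much: the model must learn the target schema’s precise names and argument formats, not just the general idea of the call.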

Edge Agents and Reference Demos

FunctionGemma’s primary deployment targets are edge agents that run on local devices such as phones, laptops, and accelerators like the NVIDIA Jetson Nano. Quantization and the small 0.3B parameter count allow inference on consumer hardware with little memory.

The Google AI Edge Gallery offers a number of reference experiences:

  • Mobile Actions: a fully offline on-device assistant built with FunctionGemma, fine-tuned on the Mobile Actions dataset before deployment.
  • Tiny Garden: a voice-controlled game. The model decomposes commands like “Plant sunflowers in the top row and water them” into domain-specific functions such as plant_seed and water_plots with explicit grid coordinates.
  • FunctionGemma Physics Playground: a web game that runs entirely in the browser via Transformers.js. Users solve physics puzzles with natural-language instructions, which the model converts into simulation actions.
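On the application side, the model’s structured calls are routed to local handlers, so no server round-trip is needed. Here is a toy dispatcher in the spirit of the Tiny Garden demo; the function names plant_seed and water_plots come from the article, but the handler bodies, grid representation, and decomposed call sequence are invented for illustration:

```python
# Toy local dispatcher: model-emitted calls are looked up in a registry of
# device-side handlers and executed with their structured arguments.
garden = {}

def plant_seed(row, col, plant):
    garden[(row, col)] = {"plant": plant, "watered": False}

def water_plots(cells):
    for row, col in cells:
        if (row, col) in garden:
            garden[(row, col)]["watered"] = True

REGISTRY = {"plant_seed": plant_seed, "water_plots": water_plots}

def execute(calls):
    """Run a sequence of model-emitted function calls against the registry."""
    for call in calls:
        REGISTRY[call["name"]](**call["arguments"])

# “Plant sunflowers in the top row and water them” might decompose into:
execute([
    {"name": "plant_seed", "arguments": {"row": 0, "col": 0, "plant": "sunflower"}},
    {"name": "plant_seed", "arguments": {"row": 0, "col": 1, "plant": "sunflower"}},
    {"name": "water_plots", "arguments": {"cells": [(0, 0), (0, 1)]}},
])
print(garden)
```

The registry pattern is what makes the multi-step behavior possible: the model only plans the call sequence, and all side effects stay in local code.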

These demos show that, with the right fine-tuning and tool interfaces, a 270M-parameter function caller can support multi-step logic without any server calls.

What you need to know

  1. FunctionGemma is a text-only, 270M-parameter variant of Gemma 3, trained specifically for function calling rather than open-ended chat, and released as an open model under Gemma’s terms of use.
  2. The model retains the Gemma 3 transformer architecture with a 256K-token vocabulary, supports 32K tokens per request shared between inputs and outputs, and was trained on 6T tokens.
  3. FunctionGemma relies on a rigid chat template with dedicated control tokens for function declarations, calls, and responses; production systems must reproduce this template to get reliable tool use.
  4. On the Mobile Actions benchmark, accuracy improves from 58 percent for the base model to 85 percent with task-specific fine-tuning, showing that small function callers need domain adaptation more than prompt engineering.
  5. FunctionGemma runs on phones, laptops, and Jetson devices, and is integrated with ecosystems such as Hugging Face, Vertex AI, and LM Studio, alongside the edge demos Mobile Actions, Tiny Garden, and Physics Playground.
