
JSON Prompting: Practical Guide for LLMs with Python Coding Example

Tech | By Gavin Wallace | 24/08/2025 | 8 Mins Read

JSON prompting gives AI models clear, explicit instructions using JavaScript Object Notation. JSON prompts are more precise than traditional text prompts, which can be ambiguous and easily misinterpreted. They organize requests as key-value pairs and nested objects, turning vague requests into blueprints the model can follow. This greatly improves consistency and accuracy, especially for complex or repetitive tasks, by letting users specify the task type, topic, audience, output format, and other parameters in a structure that language models handle well. Major LLMs such as GPT-4 and Claude respond well to JSON prompting, producing more accurate, predictable output.

In this tutorial we'll explore JSON prompting and explain why it is so powerful. It can change the way you interact with AI.

We will walk through the benefits of JSON prompting with coding examples, moving from simple text prompts to structured JSON prompts and comparing their outputs. Structured prompts will help you build workflows that are more precise, consistent, and scalable.

Installing Dependencies

import os
from getpass import getpass

os.environ["OPENAI_API_KEY"] = getpass('Enter OpenAI API Key: ')

Visit https://platform.openai.com/settings/organization/api-keys to generate your OpenAI API key. New users may need to add billing information and make a minimum $5 payment.

from openai import OpenAI

client = OpenAI()

Structured Prompts Ensure Consistency

Using structured prompts, such as JSON-based formats, forces you to think in terms of fields and values, a real advantage when working with LLMs.

Defining the structure up front removes ambiguity and ensures that each response follows a consistent pattern.

Here is a simple example of a task:

After reading this email, summarize it and make a list of the next steps.

Email:
Hello team. Let's finish the marketing plan before Tuesday. Alice prepare the first draft, Bob handle the design.

We will feed this request to the LLM in two different ways and observe how the results differ between a free-form prompt and a JSON-based structured prompt.

Free-Form Prompt

prompt_text = """
List the actions that you will take after reading the email.

Email:
Hello team. Let's finish the marketing plan before Tuesday. Alice prepare the first draft, Bob handle the design.
"""

response_text = client.chat.completions.create(
    model="gpt-5",
    messages=[{"role": "user", "content": prompt_text}]
)

text_output = response_text.choices[0].message.content
print(text_output)
Summary:
The team must finalize the marketing plan by Tuesday. Alice will prepare the first draft, while Bob will handle the design.

Action Items:
- Alice: Prepare the first draft.
- Bob: Finish the design before Tuesday.
- Team: Finalize the marketing plan by Tuesday.

JSON Prompt

prompt_json = """
The output must be strictly JSON.

 high"


Email:
Hello team. Let's finish the marketing plan before Tuesday. Alice prepare the first draft, Bob handle the design.
"""

response_json = client.chat.completions.create(
    model="gpt-5",
    messages=[
        {"role": "system", "content": "You are a precise assistant that always replies in valid JSON."},
        {"role": "user", "content": prompt_json}
    ]
)

json_output = response_json.choices[0].message.content
print(json_output)

{
  "summary": "Finalize the marketing plan by Tuesday; Alice to draft and Bob to handle design.",
  "action_items": [
    "Alice: prepare the draft",
    "Bob: handle the design",
    "Team: finalize the marketing plan by Tuesday"
  ],
  "priority": "medium"
}

The structured JSON in this example produces output that is easy to read and parse. By defining fields such as "summary", "action_items", and "priority", the LLM returns a complete and consistent response. Instead of a free flow of text that may vary in detail and style, the model produces predictable output with no ambiguity. This not only improves readability, it also makes the response easier to integrate into workflows such as dashboards or automated email handling.
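Because the response is valid JSON, it can be parsed directly with Python's standard library. A minimal sketch, using a hard-coded copy of the output above in place of a live API call:

```python
import json

# Hard-coded copy of the structured response above (stands in for json_output).
raw = """{
  "summary": "Finalize the marketing plan by Tuesday; Alice to draft and Bob to handle design.",
  "action_items": [
    "Alice: prepare the draft",
    "Bob: handle the design",
    "Team: finalize the marketing plan by Tuesday"
  ],
  "priority": "medium"
}"""

data = json.loads(raw)  # raises json.JSONDecodeError if the model strays from JSON
print(data["priority"])
for item in data["action_items"]:
    print("-", item)
```

In a real pipeline you would wrap json.loads in a try/except so malformed responses can be retried or repaired rather than crashing the workflow.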

Users Can Control the Output

JSON removes ambiguity from the prompt itself. A plain-text request can produce inconsistent results, but structuring the request in JSON, with clearly defined fields like "summary", "sentiment", "opportunities", "risks", and "confidence_score", makes the response predictable, machine-friendly, and easy to parse. Whether you're generating content, analyzing reports, or extracting insights, this consistency keeps your workflow streamlined and reliable, with no surprises: clean, structured results every time.
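As a side note, the OpenAI Chat Completions API also offers a JSON mode (response_format={"type": "json_object"}) that constrains the model to emit valid JSON; combining it with a JSON-shaped prompt further reduces parse failures. A sketch of the request construction (the API call itself is commented out since it needs a configured client; note JSON mode requires the word "JSON" to appear in the messages):

```python
# Sketch: enabling OpenAI's JSON mode (assumes the openai package and a valid key).
request_kwargs = dict(
    model="gpt-5",
    messages=[
        {"role": "system", "content": "You are a precise assistant that always replies in valid JSON."},
        {"role": "user", "content": "Return a JSON object with keys 'summary' and 'priority'."},
    ],
    response_format={"type": "json_object"},  # constrains output to valid JSON
)
# response = client.chat.completions.create(**request_kwargs)  # uncomment with a configured client
print(sorted(request_kwargs.keys()))
```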

Free-Form Prompt

plain_text_prompt = """
Review the market updates below:

Market Text
Tesla's Q2 results exceeded expectations thanks to increased Model Y sales. However, rising BYD competition is a threat.
Apple's revenue grew steadily, driven by iPhone sales. However, services revenues fell slightly.
Amazon AWS is still the dominant cloud computing provider in Europe, despite increasing regulatory scrutiny.

Generate:
A two-line market summary
The sentiment of each company (positive, negative, neutral)
Key growth risks and opportunities
A confidence score from 0-10
"""

response_plain = client.chat.completions.create(
    model="gpt-5",
    messages=[{"role": "user", "content": plain_text_prompt}]
)

plain_output = response_plain.choices[0].message.content
print(plain_output)
Market Summary:
- Earnings updates are largely constructive: Tesla beat Q2 expectations on strong Model Y sales, Apple grew iPhone revenue, and AWS remains the cloud leader.
- Apple's services are declining, BYD is putting pressure on Tesla, and Amazon faces growing regulatory scrutiny in Europe.

Sentiment:
- Tesla: Positive
- Apple: Neutral
- Amazon: Positive

Growth Opportunities and Risks:
Tesla
 - Opportunity: Maintain Model Y momentum; scale production and pricing to increase volume.
 - Risk: Increasing competition from BYD may pressure share and margins.
Apple
 - Opportunity: Monetize the large iPhone user base; re-accelerate services via bundles and ecosystem engagement.
 - Risk: Softening services; reliance on iPhone for top-line growth.
Amazon (AWS)
 - Opportunity: Use cloud leadership to win enterprise AI workloads and long-term commitments.
 - Risk: European regulatory scrutiny could bring penalties, compliance costs, and contract/pricing restrictions.

Confidence Score: 7/10

JSON Prompt

json_prompt = """
The following is a market update that you can analyze and then return your response to in JSON format.

{
  "summary": "2-line market overview",
  "companies": [
     negative 
  ],
  "confidence_score": "integer (0-10)"
}

Market Text
Tesla's Q2 results exceeded expectations thanks to increased Model Y sales. However, rising BYD competition is a threat.
Apple's revenue is growing steadily, driven by iPhone sales. However, services revenues are slightly down.
Amazon AWS is still the dominant cloud computing provider in Europe, despite increased regulatory scrutiny.
"""

response_json = client.chat.completions.create(
    model="gpt-5",
    messages=[
        {"role": "system", "content": "You are a precise assistant that always outputs valid JSON."},
        {"role": "user", "content": json_prompt}
    ]
)

json_output = response_json.choices[0].message.content
print(json_output)

{
  "summary": "Markets saw mixed corporate updates: Tesla beat expectations on strong Model Y sales and AWS maintained cloud leadership.nHowever, Apple's growth was tempered by softer services revenue while Tesla and AWS face competition and regulatory risks.",
  "companies": [
    {
      "name": "Tesla",
      "sentiment": "positive",
      "opportunities": [
        "Leverage strong Model Y demand to drive revenue and scale production",
        "Sustain earnings momentum from better-than-expected Q2 results"
      ],
      "risks": [
        "Intensifying competition from BYD",
        "Potential price pressure impacting margins"
      ]
    },
    {
      "name": "Apple",
      "sentiment": "neutral",
      "opportunities": [
        "Build on steady iPhone-driven revenue growth",
        "Revitalize Services to reaccelerate growth"
      ],
      "risks": [
        "Slight decline in services revenue",
        "Reliance on iPhone as the primary growth driver"
      ]
    },
    {
      "name": "Amazon (AWS)",
      "sentiment": "positive",
      "opportunities": [
        "Capitalize on cloud leadership to win new enterprise workloads",
        "Expand higher-margin managed services and deepen customer spend"
      ],
      "risks": [
        "Increasing regulatory scrutiny in Europe",
        "Potential compliance costs or operational restrictions"
      ]
    }
  ],
  "confidence_score": 8
}

The free-form output was difficult to parse programmatically or integrate into workflows because it lacked structure.

The JSON result, by contrast, let the user control the format. It ensured clean, machine-readable output with fields for the summary, sentiment, opportunities, risks, and confidence score. This structured approach not only simplifies downstream processing for dashboards, automated alerts, or data pipelines, but also guarantees consistency across responses. By defining fields in advance, users guide the model, which reduces ambiguity and improves reliability.
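Once the analysis arrives in this shape, downstream logic reduces to a few lines of dictionary access. A sketch using a trimmed, hard-coded version of the report above (the low-confidence routing rule is an illustrative assumption, not part of the original workflow):

```python
# Trimmed, hard-coded version of the structured market report above.
report = {
    "companies": [
        {"name": "Tesla", "sentiment": "positive", "risks": ["BYD competition"]},
        {"name": "Apple", "sentiment": "neutral", "risks": ["Services revenue decline"]},
        {"name": "Amazon (AWS)", "sentiment": "positive", "risks": ["EU regulatory scrutiny"]},
    ],
    "confidence_score": 8,
}

# Consistent fields make filters trivial, e.g. for a dashboard widget.
positive = [c["name"] for c in report["companies"] if c["sentiment"] == "positive"]
print(positive)  # -> ['Tesla', 'Amazon (AWS)']

# Illustrative routing rule: send low-confidence analyses to human review.
if report["confidence_score"] < 6:
    print("Flagged for review")
```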

Reusable JSON Prompt Templates Improve Scalability

By pre-defining fields, teams can produce consistent outputs for APIs, apps, and databases without manual formatting. Standardized fields not only speed up the workflow but also make results reliable and repeatable, which eases collaboration and automation across projects.
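One way to operationalize this is a small helper that stamps any task into the same JSON-shaped prompt. This is an illustrative sketch, not a library API; build_json_prompt and its parameters are assumptions:

```python
import json

def build_json_prompt(task: str, schema: dict, text: str) -> str:
    """Illustrative helper: combine a task description, an output schema,
    and source text into one reusable JSON prompt."""
    return (
        f"{task}\n\n"
        "Return the output strictly as JSON in this format:\n"
        f"{json.dumps(schema, indent=2)}\n\n"
        f"Text:\n{text}"
    )

# Reuse the same template for any task by swapping the schema and text.
email_schema = {"summary": "string", "action_items": ["string"], "priority": "low | medium | high"}
prompt = build_json_prompt("Summarize the email and list next steps.", email_schema, "Hello team. ...")
print(prompt)
```

The same helper covers the market-analysis case by passing a different schema, so every prompt in the pipeline shares one predictable shape.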




I'm a Civil Engineering graduate (2022) from Jamia Millia Islamia, New Delhi, with an interest in Data Science, particularly Neural Networks and their applications across different areas.

