
Mozilla 'Thunderbolt' Is an Open-Source AI Client Focused On Control and Self-Hosting

BrianFagioli writes: Mozilla's email subsidiary MZLA Technologies just introduced Thunderbolt, an open-source AI client aimed at organizations that want to run AI on their own infrastructure instead of relying entirely on cloud services. The idea is to give companies full control over their data, models, and workflows while still offering things like chat, research tools, automation, and integration with enterprise systems through the Haystack AI framework. Native apps are planned for Windows, macOS, Linux, iOS, and Android. Thunderbolt allows organizations to do the following:
- Run AI with their choice of models, from leading commercial providers to open-source and local models
- Connect to systems and data: Integrate with pipelines and open protocols, including deepset's Haystack platform, Model Context Protocol (MCP) servers, and agents via the Agent Client Protocol (ACP)
- Automate workflows and recurring tasks: Generate daily briefings, monitor topics, compile reports, or trigger actions based on events and schedules
- Work seamlessly across devices with native applications for Windows, macOS, Linux, iOS, and Android
- Maintain security with self-hosted deployment, optional end-to-end encryption, and device-level access controls
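For context on the "MCP servers" item above: the Model Context Protocol frames tool calls as JSON-RPC 2.0 messages, which is what lets a client like this invoke tools on a self-hosted server. A minimal sketch of building one such request (the tool name and arguments here are hypothetical, purely for illustration):

```python
import json

def build_mcp_tool_call(request_id, tool_name, arguments):
    """Build a JSON-RPC 2.0 request in the shape MCP uses for tool calls."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical tool and arguments, for illustration only.
req = build_mcp_tool_call(1, "search_mail", {"query": "quarterly report"})
print(json.dumps(req, indent=2))
```

A real client would send this over the MCP transport (stdio or HTTP) and dispatch on the response's `result` or `error` field.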


  • by SlashbotAgent ( 6477336 ) on Friday April 17, 2026 @04:15PM (#66099112)

    How does this compare or where does this fit when considering the likes of Ollama or openClaw?

    For that matter, do any of these matter or should one simply install Claude Desktop?

    • by caseih ( 160668 ) on Friday April 17, 2026 @04:41PM (#66099158)

      While Ollama does have a GUI, as I understand it and use it, its primary purpose is to download, host, and run local LLM models. I always use Ollama from the command prompt. You usually interact with models hosted in Ollama through a separate agent.
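The "separate agent" part is just HTTP: Ollama exposes a local REST API (by default on port 11434) that any client can call. A sketch of building a request body for its `/api/generate` endpoint; the model tag is the one from this thread, so substitute whatever you have pulled (the network call itself is left commented out since it needs a running server):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_generate_request(model, prompt):
    """Build the JSON body Ollama's /api/generate endpoint expects."""
    return {"model": model, "prompt": prompt, "stream": False}

payload = build_generate_request("gemma4:26b", "Summarize this paragraph: ...")
print(json.dumps(payload))

# To actually send it (requires a running Ollama server):
# req = urllib.request.Request(OLLAMA_URL, data=json.dumps(payload).encode(),
#                              headers={"Content-Type": "application/json"})
# print(json.loads(urllib.request.urlopen(req).read())["response"])
```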

      I suppose Thunderbolt compares roughly to Claude Desktop, AnythingLLM, or Qwen's desktop GUI.

      Running Thunderbolt or AnythingLLM with a locally-hosted Gemma 4 model is kind of interesting, possibly useful. Gemma 4 is quite amazing for being able to run on lower-powered machines. I can run gemma4:26b on an Nvidia GPU with 12 GB of VRAM pretty well, although startup time is about three minutes. I've only tested it lightly. It seems to be able to make sense of simple C language programs, can translate languages, and can analyze and summarize documents. Quite useful, actually.

      I personally prefer the CLI agents like OpenCode. I suppose openclaw could be in this camp, but useful agents to me have lots of checks to prevent agents from wreaking havoc without intervention.

    • Probably somewhere along the ditch where real RAG with properly managed self-hosted models has blown past them. Ollama + GUI is not much of a workflow. You need some type of embedding/vector db system, a web search engine/scraper so the model can make tool calls to the web, some type of hybrid search with a re-ranking model, and then the final generation model. A typical stack is going to be something like open-webui + LiteLLM + vLLM on the backend.
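The hybrid-search step in the stack described above can be sketched in toy form: blend a keyword-overlap score with a bag-of-words cosine similarity, then keep the top results. A real stack uses embedding models and a dedicated re-ranker; this is a stdlib-only illustration of the ranking idea:

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bags of words (Counters)."""
    num = sum(a[t] * b[t] for t in a if t in b)
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def hybrid_search(query, docs, alpha=0.5, top_k=2):
    """Blend keyword overlap with cosine similarity, then rank documents."""
    q = Counter(query.lower().split())
    scored = []
    for doc in docs:
        d = Counter(doc.lower().split())
        keyword = sum(1 for t in q if t in d) / len(q)  # fraction of query terms present
        score = alpha * keyword + (1 - alpha) * cosine(q, d)
        scored.append((score, doc))
    return [doc for score, doc in sorted(scored, reverse=True)[:top_k]]

docs = [
    "self-hosted llm deployment guide",
    "banana bread recipe",
    "vector database for llm retrieval",
]
print(hybrid_search("self-hosted llm", docs))
```

In a production pipeline the keyword half would be BM25 from a search index and the similarity half would come from a vector store, with a cross-encoder re-ranking the merged candidates.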

      Or yeah, just use Claude Desktop and Anthropic's
    • by znrt ( 2424692 )

      How does this compare or where does this fit when considering the likes of Ollama or openClaw?

      For that matter, do any of these matter or should one simply install Claude Desktop?

      claude desktop is essentially your llm chat loop (you write a prompt, llm prints a result, you copy/paste what you want, rinse and repeat) with some handy features: you can create separate projects with prompt history and global rules, you can upload source files or you can give it read access to your github repos and it produces files you can then diff directly to your own. it supports anthropic's models and you can use sonnet 4.6 for free (which is quite good).
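The chat loop described above reduces to: keep a message history, append the user's prompt, call the model, append its reply, repeat. A minimal sketch with a stubbed model function (any real client swaps in an API call where the stub is):

```python
def chat_turn(history, user_prompt, model_fn):
    """One turn of a chat loop: record the prompt, call the model, record the reply."""
    history.append({"role": "user", "content": user_prompt})
    reply = model_fn(history)
    history.append({"role": "assistant", "content": reply})
    return reply

# Stub standing in for a real model call (e.g. an HTTP request to an LLM API).
def echo_model(history):
    return f"You said: {history[-1]['content']}"

history = []
print(chat_turn(history, "hello", echo_model))  # → You said: hello
```

The per-project prompt history and global rules that desktop clients add are just persistence and a system prompt layered on top of this same loop.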

      llama.cpp is just a model runner for any mod

  • great naming choice (Score:5, Informative)

    by d4fseeker ( 1896770 ) on Friday April 17, 2026 @04:28PM (#66099140)
    Having something whose name resembles both Mozilla Thunderbird and Intel Thunderbolt but is neither... Probably an AI came up with that name and idea
    • Yeah I voted for Thunderzilla but no takers sadly.
    • Seeing the name made me momentarily forget the name of the email client. Took me several seconds to be able to recall "Thunderbird" even though it's open right in front of me.

    • by vux984 ( 928602 )

      Nah, Mozilla has been naming things after things that already exist for years.

      Remember Mozilla Phoenix? Mozilla Firebird?

      https://www.mozillazine.org/fi... [mozillazine.org]

    • Exactly.

      Somebody said they spent a lot of time and money coming up with a new "nonbinary" mascot.

      I started using Firefox over 20 years ago and never had any idea whether the fox was male or female. Since thunderbirds are fictional I'm not sure about the sexual dimorphism of their plumage.

      It's hard to understand how they think these days. At least Ladybird will offer a second rendering engine when the bubble pops. Engineering-focused software organizations used to be the norm.

      I guess a Ladybird is a femal

    • Having something that both resembles in name to Mozilla Thunderbird and Intel Thunderbolt but is neither... Probably an AI came up with that name and idea

      It definitely could have been worse: they could have taken a page out of their competitor's book and named three different products the same name.

    • by radoni ( 267396 )

      Mozilla Firefox ripped out WebUSB support and closed as WONTFIX. Now they've got a product named after a USB related standard? WTAF

  • by jenningsthecat ( 1525947 ) on Friday April 17, 2026 @10:35PM (#66099538)

    If memory and processor prices ever become sane and reasonable again, could this be the end of the AI bubble? If free-as-in-beer-AND-speech models are readily available, and if the computing power required to run them is affordable, what do the major AI merchants who've been inflating the bubble have to offer?

    Sure, there's the training time and effort. But just as the internet spelled the end of having no choice but to pay for music and other media, won't it also be the way in which the training data that AI companies already stole can be re-stolen by folks who are running their own LLMs on their own hardware?

    • by Anonymous Coward

      The longer people talk about a bubble, the less credible it gets. Do you know how long typical bubbles last? Either start talking later about bubbles, or make it burst already. That thing is pretty stable at the moment ...

    • There is no AI bubble.

      We are in the 6th Kondratieff.

      No idea why the /. crowd insists on not grasping that.

      This is the 6th mega revolution in technology for mankind.

      And you insist it is a bubble.

      How stupid are you?

      You have the chance to be part of it, ride it, be the Shockwave rider.

      But you neglect to learn the simplest things about this technology.

      You are so absurd, it is unbelievable.

  • They could have made something to save and organize webpages, but instead, they brought more stinky AI into the world.
  • Brilliant name, this won't create any confusion whatsoever. Here's a hint, Mozilla: don't use a name that's already taken in the tech industry. Thunderbolt is a technology for connecting peripherals to computers. (Yes, I know about the different capitalization, but with the quality of journalism nowadays I suspect that detail will be lost and unnecessary confusion will be a fact.)
