The latest Triple-i Initiative Showcase is nearly upon us, as the indie-focused event returns for its third consecutive year on Thursday, April 9 at 12PM ET / 9AM PT. We're being promised announcements for 40 games, including eight world premieres, so it's well worth tuning in if you like your indies.

You'll be able to watch the stream on The Triple-i Initiative YouTube channel, as well as on Twitch, bilibili, niconico and Steam. Co-streaming partners IGN and GameSpot will also host their own streams. The showcase will run for 45 minutes, and nine featured studios will have post-show deep dives into their games if you want to know more. As before, the mantra here is "no hosts, no ads, just games," so rest assured your attention will be rewarded.

Confirmed featured games so far include Risk of Rain 2, the open-world survival game Windrose and Castlevania: Belmont's Curse. We also know that the studio behind the excellent sci-fi narrative adventure 1000xResist will be showing off what it's been working on, and we can expect news from Cairn developer The Game Bakers.

It sounds like a typically eclectic lineup, then, and given that last year's showcase delivered release dates for 2025 indie hits like The Alters and Rematch, you can be confident that plenty of notable news will come out of this one too.

This article originally appeared on Engadget at https://www.engadget.com/gaming/how-to-watch-the-triple-i-initiative-showcase-on-april-9-170353957.html?src=rss
Following the icy reception to Llama 4, Meta is releasing the first in a new family of AI systems built by its recently formed Superintelligence team. The company is kicking off its new Muse era with Spark, a lightweight model geared toward consumer use. In the future, Meta plans to offer more capable versions of Muse, but for now, it's clear the company wants to nail the basics.

To that point, many of Spark's capabilities are table stakes for a new model in 2026. For instance, it offers both "Instant" and "Thinking" modes. With the latter engaged, the model takes an extra few moments to reason through a prompt. Other consumer-facing AI systems have had this kind of flexibility for a while; Anthropic, for example, was one of the first AI labs to offer a "hybrid reasoning model" when it released Claude 3.7 Sonnet at the start of last year. That said, Meta plans to add an even more powerful "Contemplating" mode down the road.

A GIF demonstrating Muse Spark's multi-agent capabilities. (Meta)

Muse Spark can also coordinate multiple AI subagents to tackle a request. Meta suggests users will see this capability in action when they ask for help with tasks like family trip planning. In such a scenario, one agent might compile an itinerary, while another finds kid-friendly activities everyone can enjoy.

At the same time, Meta has built Spark to be natively multimodal, meaning the model can process images, video and audio. Like Google Lens, this gives you the option to snap a photo with your phone and ask Meta AI questions about what you see. Of course, it wouldn't be a 2026 AI release if Muse Spark didn't include a built-in shopping assistant. Like ChatGPT, Spark can compare different items for you, listing the pros and cons of each, with links that make it easy to buy the product that appeals to you.

Muse Spark is available today in the Meta AI app and on the meta.ai website everywhere the company offers those services.
Meta will begin rolling out the new features the model powers starting in the US. In the coming weeks, the company plans to bring Muse Spark to more countries and to other places where people can access Meta AI, including Facebook, Instagram and WhatsApp. Additionally, Meta says it "hopes to open source future versions of the model." We'll see if the company ends up doing that; last year, Meta CEO Mark Zuckerberg appeared to flip-flop on the company's open source stance, saying it would need to be more "rigorous" about such decisions moving forward.

This article originally appeared on Engadget at https://www.engadget.com/ai/metas-muse-spark-model-brings-reasoning-capabilities-to-the-meta-ai-app-161456684.html?src=rss