The Mui Board is a delightful, expensive way to control your smart home

One of the fun parts of being a tech journalist for over a decade is that occasionally you get to watch a truly unique gadget go from concept to reality — and, eventually, into your living room. That’s the case with the Mui Board, a smart home controller built into a piece of wood.
The Mui was first demoed at CES in 2019, and I’ve seen it at several shows over the years, in various iterations, always with the promise that it would ship soon. Well, this year it did (in limited quantities!), and I finally got to try one out in my living room.
You could call the Mui Board an anti-smart display. A Raspberry Pi-based smart home controller with Matter support, it’s just a piece of wood on your wall when not in use. But when you touch its capacitive surface, glowing dot-matrix icons appear, letting you control your music and lighting, set timers, view messages, and access other information you might use a smart display for.
The idea is that technology should blend into your home — be calm, accessible, and integrated, not jarring or distracting. Mui is a Japanese term for being in harmony with nature, and the device is certified by the Calm Tech Institute. As someone surrounded by screens and smart displays in my home, I have long been intrigued by Mui’s alternative approach and was excited to try it out.
I’ve had the Mui Board second gen in my home for a few days now, and while I haven’t had a chance to fully put it through its paces, I’ve been impressed by its responsiveness, thoughtful design, and ambitious scope.
The second-gen model debuted in 2023 but only became available for purchase earlier this year through Mui’s website and Indiegogo. At $999 (on sale for $799), the Mui Board still feels closer to a beautifully realized concept than a mass-market product. It’s hard to imagine many people spending a thousand dollars on a smart home controller — but for the right person, it’s undeniably delightful.
A simpler smart home controller
Created by Japan-based Mui Labs, the Mui Board comes in two colors: natural maple or dark cherry. I am testing the maple version and have mounted it on the wall in my living room, just above my sofa.
From there, I can reach up and tap it to see the time or weather, check what’s playing on my Sonos system, control my Hue lights, set a timer, and see the latest headlines from The Verge via a neat RSS feed feature.
It’s a simple interface with minimalist icons that offer more control than you might expect at first glance. You can dim and turn lights on and off (no color changing), adjust a thermostat’s temperature, mode, and fan speed, open and close curtains and locks, and turn smart plugs on and off. What’s different from other smart home controllers with touch interfaces is that there are no ads, no chatty voice assistant, and no bright, distracting display vying for my attention.
- Price: $999
- Dimensions: 23 x 3 x 1 inches
- Connectivity: Wi-Fi (2.4GHz), BLE
- Smart home protocols: Matter Controller, Echonet Lite, Web API
- Installation: Wall-mounted
- Hardware: Speaker and microphone
Out of the box, the Mui Board connects to Wi-Fi and can display the time and weather. You can set timers, alarms, and reminders, and send messages to other Mui boards or the Mui app. It has two small built-in speakers and a microphone for recording messages, and is powered by an included AC adapter.
Touching and drawing on the Mui Board is its best feature. It’s responsive and easy to use, and it has some delightful features. A small cat roams around the board and changes direction when you tap it. There’s a piano / drum mode that turns the board into a music machine, and you write messages on it by hand, not with a keyboard. It feels like technology you can play with.
For smart home control, the Mui Board supports Matter and integrates with Sonos, Philips Hue, LIFX, SwitchBot, Ecobee, and Google Calendar APIs, among others, plus several Japanese services, including Radiko and the Echonet smart home protocol.
Using the fairly simple Mui app, I connected the board to my Sonos system; it displayed the currently playing track’s title, volume, and playback controls, and I could skip forward or back in my playlist.
As a Matter controller, it can set up and control supported Matter devices directly, without using Amazon, Google, Apple, or similar services. It works with lighting, plugs, and thermostats, with locks next on the roadmap.
It currently supports Matter over Wi-Fi, with Thread support planned. There’s no Thread radio onboard, so you’ll need a third-party border router. Lighting works best so far, while the other categories are still in limited testing.
I successfully added a Meross Matter Wi-Fi smart plug directly to the board but struggled to pair some Matter devices, such as my Nest thermostat, using Matter’s multi-admin feature. I plan to do more testing here.
I was able to connect my Hue lighting setup and could turn on and off all the lights or control each individually on the board. It was fairly slow, however, probably because I have a large Hue setup, and it was using the cloud API rather than a local connection. I really liked the Veil of Night feature that lets you draw a line on the board to set a timer that gradually dims the lighting.
Ultimately, the Mui Board is a big button / switch for controlling your smart home, with the benefit of icons to guide you to what you’re doing without a bright screen. As with any icon-based control, it takes a bit of learning to remember which icon does what, but you can customize the layout to put your most-used controls on the first screen and scroll through additional screens for more options.
While I love the natural, furniture-like feel, when the board is off, it slightly resembles a two-by-four mounted on the wall — as if my husband abandoned a DIY project halfway through. A rounded edge, more sculptural profile, or small shelf accessory would go a long way toward softening the look. Also, there’s the issue of what to do with the cable.
I installed the Mui in our living room, but after using it for a few days, I think the ideal place for this is in a bedroom, over a bedside table or headboard. That’s the room most people would like to keep screens out of, yet still want to control lights, locks, music, etc., without reaching for a phone or using voice.
I don’t see myself using the Mui Board exclusively as a smart home controller; its software doesn’t feel quite there yet for creating scenes and automations. But as an interface to your smart home, it’s a breath of fresh air. And as a fun device for controlling music, setting timers, and playing the piano, it’s successful, if very, very expensive.
It’s been a slow burn to get the Mui Board to where it is today, and I’m intrigued to see where this thoughtful company takes its calm tech concept next. Mui Labs will be at CES again this year, where the company plans to debut a new well-being-focused sleep experience and “movement-based lighting control” for the Mui Board.
Photos and video by Jennifer Pattison Tuohy / The Verge
Apptio: Why scaling intelligent automation requires financial rigour
Greg Holmes, Field CTO for EMEA at Apptio, an IBM company, argues that successfully scaling intelligent automation requires financial rigour.
The “build it and they will come” model of technology adoption often leaves a hole in the budget when applied to automation. Executives frequently find that successful pilot programmes do not translate into sustainable enterprise-wide deployments because initial financial modelling ignored the realities of production scaling.
“When we integrate FinOps capabilities with automation, we’re looking at a change from being very reactive on cost management to being very proactive around value engineering,” says Holmes.
This shifts the assessment criteria for technical leaders. Rather than waiting “months or years to assess whether things are getting value,” engineering teams can track resource consumption – such as cost per transaction or API call – “straight from the beginning.”
The unit economics of scaling intelligent automation
Innovation projects face a high mortality rate. Holmes notes that around 80 percent of new innovation projects fail, often because financial opacity during the pilot phase masks future liabilities.
“If a pilot demonstrates that automating a process saves, say, 100 hours a month, leadership thinks that’s really successful,” says Holmes. “But what it fails to track is that the pilot sometimes is running on over-provisioned infrastructure, so it looks like it performs really well. But you wouldn’t over-provision to that degree during a real production rollout.”
Moving that workload to production changes the calculus. The requirements for compute, storage, and data transfer increase. “API calls can multiply, exceptions and edge cases appear at volume that might have been out of scope for the pilot phase, and then support overheads just grow as well,” he adds.
To prevent this, organisations must track the marginal cost at scale. This involves monitoring unit economics, such as the cost per customer served or cost per transaction. If the cost per customer increases as the customer base grows, the business model is flawed.
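The unit-economics check Holmes describes can be sketched in a few lines. This is a minimal illustration, not an Apptio or TBM implementation; the cost categories and figures are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class PeriodCosts:
    """Fully loaded automation costs for one reporting period."""
    infrastructure: float  # compute, storage, data transfer
    api_calls: float       # per-call charges at volume
    support: float         # exception handling and maintenance labour
    transactions: int      # units of work delivered

def unit_cost(p: PeriodCosts) -> float:
    """Marginal cost per transaction: total spend divided by units delivered."""
    return (p.infrastructure + p.api_calls + p.support) / p.transactions

def scales_healthily(periods: list[PeriodCosts]) -> bool:
    """True if cost per transaction falls (or holds) as volume grows."""
    costs = [unit_cost(p) for p in periods]
    return all(later <= earlier for earlier, later in zip(costs, costs[1:]))

# Pilot vs. production quarters (illustrative figures only)
history = [
    PeriodCosts(5_000, 1_000, 2_000, 10_000),      # pilot: $0.80/txn
    PeriodCosts(18_000, 6_000, 8_000, 50_000),     # scale-up: $0.64/txn
    PeriodCosts(30_000, 12_000, 13_000, 110_000),  # production: $0.50/txn
]
print(scales_healthily(history))  # falling unit cost signals healthy scaling
```

The point of the sketch is the denominator: tracking total spend alone would show costs rising sixfold, while the per-transaction view shows the economics actually improving.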
Conversely, effective scaling should see these unit costs decrease. Holmes cites a case study from Liberty Mutual where the insurer was able to find around $2.5 million of savings by bringing in consumption metrics and “not just looking at labour hours that they were saving.”
However, financial accountability cannot sit solely with the finance department. Holmes advocates for putting governance “back in the hands of the developers into their development tools and workloads.”
Integration with infrastructure-as-code tools like HashiCorp Terraform and GitHub allows organisations to enforce policies during deployment. Teams can spin up resources programmatically with immediate cost estimates.
“Rather than deploying things and then fixing them up, which gets into the whole whack-a-mole kind of problem,” Holmes explains, companies can verify they are “deploying the right things at the right time.”
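The "verify before deploying" idea can be reduced to a simple policy gate: estimate a plan's monthly cost and block the deployment if it exceeds the team's budget. The function names, resource types, and prices below are hypothetical, not a real Terraform or Apptio API.

```python
# Hypothetical pre-deployment policy gate: compare a plan's estimated
# monthly cost against a team budget before any resources are created.

def estimate_monthly_cost(plan: dict[str, int],
                          price_sheet: dict[str, float]) -> float:
    """Sum per-resource monthly prices for every resource in the plan."""
    return sum(price_sheet[resource] * count for resource, count in plan.items())

def policy_gate(plan: dict[str, int],
                price_sheet: dict[str, float],
                budget: float) -> bool:
    """Approve the deployment only if the estimate fits the budget."""
    return estimate_monthly_cost(plan, price_sheet) <= budget

prices = {"vm.small": 35.0, "db.standard": 120.0, "queue": 5.0}
plan = {"vm.small": 4, "db.standard": 1, "queue": 2}
print(policy_gate(plan, prices, budget=300.0))  # $270 estimate fits the budget
```

In practice this check would run in the CI pipeline against the infrastructure-as-code plan output, so an over-budget change fails review instead of becoming a whack-a-mole cleanup later.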
When scaling intelligent automation, tension often simmers between the CFO, who focuses on return on investment, and the Head of Automation, who tracks operational metrics like hours saved.
“This translation challenge is precisely what TBM (Technology Business Management) and Apptio are designed to solve,” says Holmes. “It’s having a common language between technology and finance and with the business.”
The TBM taxonomy provides a standardised framework to reconcile these views. It maps technical resources (such as compute, storage, and labour) into IT towers and further up to business capabilities. This structure translates technical inputs into business outputs.
“I don’t necessarily know what goes into all the IT layers underneath it,” Holmes says, describing the business user’s perspective. “But because we’ve got this taxonomy, I can get a detailed bill that tells me about my service consumption and precisely which costs are driving it to be more expensive as I consume more.”
Addressing legacy debt and budgeting for the long-term
Organisations burdened by legacy ERP systems face a binary choice: automation as a patch, or as a bridge to modernisation. Holmes warns that if a company is “just trying to mask inefficient processes and not redesign them,” they are merely “building up more technical debt.”
A total cost of ownership (TCO) approach helps determine the correct strategy. The Commonwealth Bank of Australia utilised a TCO model across 2,000 different applications – of various maturity stages – to assess their full lifecycle costs. This analysis included hidden costs such as infrastructure, labour, and the engineering time required to keep automation running.
“Just because something’s legacy doesn’t mean you have to retire it,” says Holmes. “Some of those legacy systems are worth maintaining just because the value is so good.”
In other cases, calculating the cost of the automation wrappers required to keep an old system functional reveals a different reality. “Sometimes when you add up the TCO approach, and you’re including all these automation layers around it, you suddenly realise, the real cost of keeping that old system alive is not just the old system, it’s those extra layers,” Holmes argues.
Avoiding sticker shock requires a budgeting strategy that balances variable costs with long-term commitments. While variable costs (OPEX) offer flexibility, they can fluctuate wildly based on demand and engineering efficiency.
Holmes advises that longer-term visibility enables better investment decisions. Committing to specific technologies or platforms over a multi-year horizon allows organisations to negotiate economies of scale and standardise architecture.
“Because you’ve made those longer term commitments and you’ve standardised on different platforms and things like that, it makes it easier to build the right thing out for the long term,” Holmes says.
Combining tight management of variable costs with strategic commitments supports enterprises in scaling intelligent automation without the volatility that often derails transformation.
IBM is a key sponsor of this year’s Intelligent Automation Conference Global in London on 4-5 February 2026. Greg Holmes and other experts will be sharing their insights during the event. Be sure to check out the day one panel session, Scaling Intelligent Automation Successfully: Frameworks, Risks, and Real-World Lessons, to hear more from Holmes and swing by IBM’s booth at stand #362.
See also: Klarna backs Google UCP to power AI agent payments

Want to learn more about AI and big data from industry leaders? Check out AI & Big Data Expo taking place in Amsterdam, California, and London. The comprehensive event is part of TechEx and is co-located with other leading technology events including the Cyber Security & Cloud Expo. Click here for more information.
AI News is powered by TechForge Media. Explore other upcoming enterprise technology events and webinars here.
FedEx tests how far AI can go in tracking and returns management
FedEx is using AI to change how package tracking and returns work for large enterprise shippers. For companies moving high volumes of goods, tracking no longer ends when a package leaves the warehouse. Customers expect real-time updates, flexible delivery options, and returns that do not turn into support tickets or delays.
That pressure is pushing logistics firms to rethink how tracking and returns operate at scale, especially across complex supply chains.
This is where artificial intelligence is starting to move from pilot projects into daily operations.
FedEx plans to roll out AI-powered tracking and returns tools designed for enterprise shippers, according to a report by PYMNTS. The tools are aimed at automating routine customer service tasks, improving visibility into shipments, and reducing friction when packages need to be rerouted or sent back.
Rather than focusing on consumer-facing chatbots, the effort centres on operational workflows that sit behind the scenes. These are the systems enterprise customers rely on to manage exceptions, returns, and delivery changes without manual intervention.
How FedEx is applying AI to package tracking
Traditional tracking systems tell customers where a package is and when it might arrive. AI-powered tracking goes a step further, using historical delivery data, traffic patterns, weather conditions, and network constraints to flag potential delays before they happen.
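The signals described above can be combined into a simple risk score. This is an illustrative sketch only: the features, weights, and threshold are assumptions for the example, not FedEx's actual model, which is not publicly documented.

```python
# Toy delay-risk scorer: combine a few shipment signals into a 0-1 score
# and flag high-risk shipments for proactive rerouting or notification.

def delay_risk(historical_late_rate: float,
               traffic_index: float,
               severe_weather: bool,
               network_load: float) -> float:
    """Weighted blend of signals, each expected in the 0-1 range."""
    score = (0.5 * historical_late_rate
             + 0.3 * traffic_index
             + 0.2 * network_load)
    if severe_weather:
        score = min(1.0, score + 0.25)  # weather acts as a step penalty
    return round(score, 2)

def flag_for_reroute(score: float, threshold: float = 0.6) -> bool:
    """Shipments above the threshold get attention before the window is missed."""
    return score >= threshold

risk = delay_risk(historical_late_rate=0.4, traffic_index=0.8,
                  severe_weather=True, network_load=0.5)
print(risk, flag_for_reroute(risk))
```

A production system would learn these weights from historical outcomes rather than hard-coding them, but the shape of the decision is the same: score early, act before the delivery window closes.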
According to the PYMNTS report, FedEx’s AI tools are designed to help enterprise shippers anticipate issues earlier in the delivery process. Instead of reacting to missed delivery windows, shippers may be able to reroute packages or notify customers ahead of time.
For businesses that ship thousands of parcels per day, that shift matters. Small improvements in prediction accuracy can reduce support calls, lower refund rates, and improve customer trust, particularly in retail, healthcare, and manufacturing supply chains.
This approach also reflects a broader trend in enterprise software, in which AI is being embedded into existing systems rather than introduced as standalone tools. The goal is not to replace logistics teams, but to minimise the number of manual decisions they need to make.
Returns as an operational problem, not a customer issue
Returns are one of the most expensive parts of logistics. For enterprise shippers, particularly those in e-commerce, returns affect warehouse capacity, inventory planning, and transportation costs.
According to PYMNTS, FedEx’s AI-enabled returns tools aim to automate parts of the returns process, including label generation, routing decisions, and status updates. Companies that use AI to determine the most efficient return path may be able to reduce delays and avoid sending items back to the wrong facility.
This is less about convenience and more about operational discipline. Returns that sit idle or move through the wrong channel create cost and uncertainty across the supply chain. AI systems trained on past return patterns can help standardise decisions that were previously handled case by case.
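Standardising a case-by-case routing decision can be as simple as a cost comparison over eligible facilities. The facility data, fields, and costs below are invented for illustration; they do not describe FedEx's network.

```python
# Minimal return-routing sketch: pick the facility with the lowest total
# cost among those that accept the item's category.

def route_return(item_category: str, facilities: list[dict]) -> str:
    """Return the name of the cheapest facility that handles the category."""
    eligible = [f for f in facilities if item_category in f["accepts"]]
    if not eligible:
        raise ValueError(f"No facility accepts category {item_category!r}")
    best = min(eligible, key=lambda f: f["shipping_cost"] + f["handling_cost"])
    return best["name"]

facilities = [
    {"name": "East Hub", "accepts": {"apparel", "electronics"},
     "shipping_cost": 4.0, "handling_cost": 1.5},
    {"name": "Central Returns", "accepts": {"apparel"},
     "shipping_cost": 3.0, "handling_cost": 1.0},
    {"name": "West Hub", "accepts": {"electronics"},
     "shipping_cost": 2.5, "handling_cost": 2.0},
]
print(route_return("apparel", facilities))  # cheapest eligible facility
```

Encoding the rule once means every return follows the same logic, which is exactly the standardisation the article describes replacing case-by-case handling.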
For enterprise customers, this type of automation supports scale. As return volumes fluctuate, especially during peak seasons, systems that adjust automatically reduce the need for temporary staffing or manual overrides.
What FedEx’s AI tracking approach says about enterprise adoption
What stands out in FedEx’s approach is how narrowly focused the AI use case is. There are no broad claims about transformation or reinvention. The emphasis is on reducing friction in processes that already exist.
This mirrors how other large organisations are adopting AI internally. In a separate context, Microsoft has described a similar pattern, outlining how its AI tools were rolled out gradually, with clear limits, governance rules, and feedback loops.
While Microsoft’s case focused on knowledge work and FedEx’s on logistics operations, the underlying lesson is the same. AI adoption tends to work best when applied to specific activities with measurable results rather than broad promises of efficiency.
For logistics firms, those advantages include fewer delivery exceptions, lower return handling costs, and better coordination between shipping partners and enterprise clients.
What this signals for enterprise customers
For end-user companies, FedEx’s move signals that logistics providers are investing in AI as a way to support more complex shipping demands. As supply chains become more distributed, visibility and predictability become harder to maintain without automation.
AI-driven tracking and returns could also change how businesses measure logistics performance. Companies may focus less on delivery speed and more on how quickly issues are recognised and resolved.
That shift could influence procurement decisions, contract structures, and service-level agreements. Enterprise customers may start asking not just where a shipment is, but how well a provider anticipates problems.
FedEx’s plans reflect a quieter phase of enterprise AI adoption. The focus is less on experimentation and more on integration. These systems are not designed to draw attention but to reduce noise in operations that customers only notice when something goes wrong.
(Photo by Liam Kevan)
See also: PepsiCo is using AI to rethink how factories are designed and updated
Klarna backs Google UCP to power AI agent payments
Klarna aims to address the lack of interoperability between conversational AI agents and backend payment systems by backing Google’s Universal Commerce Protocol (UCP), an open standard designed to unify how AI agents discover products and execute transactions.
The partnership, which also sees Klarna supporting Google’s Agent Payments Protocol (AP2), places the Swedish fintech firm among the early payment providers to back a standardised framework for automated shopping.
The interoperability problem with AI agent payments
Current implementations of AI commerce often function as walled gardens. An AI agent on one platform typically requires a custom integration to communicate with a merchant’s inventory system, and yet another to process payments. This integration complexity inflates development costs and limits the reach of automated shopping tools.
Google’s UCP attempts to solve this by providing a standardised interface for the entire shopping lifecycle, from discovery and purchase to post-purchase support. Rather than building unique connectors for every AI platform, merchants and payment providers can interact through a unified standard.
David Sykes, Chief Commercial Officer at Klarna, states that as AI-driven shopping evolves, the underlying infrastructure must rely on openness, trust, and transparency. “Supporting UCP is part of Klarna’s broader work with Google to help define responsible, interoperable standards that support the future of shopping,” he explains.
Standardising the transaction layer
By integrating with UCP, Klarna allows its technology – including flexible payment options and real-time decisioning – to function within these AI agent environments. This removes the need for hardcoded platform-specific payment logic. Open standards provide a framework for the industry to explore how discovery, shopping, and payments work together across AI-powered environments.
The implications extend to how transactions settle. Klarna’s support for AP2 complements the UCP integration, helping advance an ecosystem where trusted payment options work across AI-powered checkout experiences. This combination aims to reduce the friction of users handing off a purchase decision to an automated agent.
“Open standards like UCP are essential to making AI-powered commerce practical at scale,” said Ashish Gupta, VP/GM of Merchant Shopping at Google. “Klarna’s support for UCP reflects the kind of cross-industry collaboration needed to build interoperable commerce experiences that expand choice while maintaining security.”
Adoption of Google’s UCP by Klarna is part of a broader shift
For retail and fintech leaders, the adoption of UCP by players like Klarna suggests a requirement to rethink commerce architecture. The shift implies that future payments may increasingly come through sources where the buyer interface is an AI agent rather than a branded storefront.
Implementing UCP generally does not require a complete re-platforming but does demand rigorous data hygiene. Because agents rely on structured data to manage transactions, the accuracy of product feeds and inventory levels becomes an operational priority.
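That data-hygiene requirement is easy to make concrete: before a product record is exposed to an agent, validate that the fields a transaction depends on are present and sane. The field names below are assumptions for illustration, not the actual UCP schema.

```python
# Hypothetical product-feed validation ahead of agent-led commerce.
# An empty problem list means the record is safe to publish.

REQUIRED = ("sku", "title", "price", "currency", "stock")

def validate_record(record: dict) -> list[str]:
    """Return a list of data-hygiene problems with this feed record."""
    problems = [f"missing field: {f}" for f in REQUIRED if f not in record]
    if not problems:
        if record["price"] <= 0:
            problems.append("price must be positive")
        if record["stock"] < 0:
            problems.append("stock cannot be negative")
        if len(record["currency"]) != 3:
            problems.append("currency must be an ISO 4217 code")
    return problems

good = {"sku": "A-100", "title": "Desk lamp", "price": 39.0,
        "currency": "USD", "stock": 12}
print(validate_record(good))  # [] -> safe to expose to agents
```

A human shopper might shrug off a stale price or missing stock count; an agent executing a transaction against the feed cannot, which is why feed accuracy becomes an operational priority rather than a marketing nicety.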
Furthermore, the model maintains a focus on trust. Klarna’s technology provides upfront terms designed to build trust at checkout. As agent-led commerce develops, maintaining clear decisioning logic and transparency remains a priority for risk management.
The convergence of Klarna’s payment rails with Google’s open protocols offers a practical template for reducing the friction of using AI agents for commerce. The value lies in the efficiency of a standardised integration layer that reduces the technical debt associated with maintaining multiple sales channels. Success will likely depend on the ability to expose business logic and inventory data through these open standards.
See also: How SAP is modernising HMRC’s tax infrastructure with AI