The gap between on-device AI and what browsers can actually access is weirdly underexplored. Phones have genuinely capable local models now but there's no clean API for web apps to tap into them. What would your ideal bridge between the two look like?
It depends on the browser, tbh. Chrome has its own built-in AI APIs (I don't use Chrome, so I can't speak to them), but I know Firefox has its own local AI interface that uses OS-level hardware acceleration.

And I think non-extension JS can access it as well, but I'm not sure.

blog.mozilla.org/en/firefox/firefox-ai/speeding-up-firefox-local-ai-runtime/
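For what it's worth, as far as I can tell Firefox's local inference engine is only exposed to WebExtensions (under a `browser.trial.ml` namespace), not to plain page JS. A quick probe sketch of which context gets what; the property names beyond `browser.trial.ml` and the global `Translator` reflect my reading of Mozilla's and MDN's docs, so treat them as assumptions:

```javascript
// Probe what kind of local-model access the current JS context has.
// `globals` is injected (e.g. globalThis) so this is easy to exercise anywhere.
function localMlAccess(globals) {
  // Firefox exposes its ML runtime to extensions via browser.trial.ml
  // (behind a trial permission), not to ordinary page scripts.
  if (globals.browser && globals.browser.trial && globals.browser.trial.ml) {
    return "extension";
  }
  // Chromium-style built-in task APIs (Translator etc.) are page-visible globals.
  if (typeof globals.Translator !== "undefined") {
    return "builtin-task-api";
  }
  // Plain page JS with neither: no direct local-model access.
  return "none";
}
```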
That doesn't look like a browser standard, though, compared to this [link], which only Chromium has implemented so far. Firefox even refused to, IIRC:

developer.mozilla.org/en-US/docs/Web/API/Translator
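The proposed shape, per MDN's draft docs, is roughly: check `availability()` for the language pair, `create()` a translator (which may trigger a model download), then call `translate()`. A sketch with the `Translator` global injected so it's clear this is the proposed API, not something every browser ships:

```javascript
// Minimal use of the proposed Translator API shape (per MDN).
// `TranslatorImpl` stands in for the page-global `Translator` so this
// sketch can run outside a browser.
async function translateText(TranslatorImpl, text, sourceLanguage, targetLanguage) {
  // availability() reports "unavailable" | "downloadable" | "downloading" | "available"
  const availability = await TranslatorImpl.availability({ sourceLanguage, targetLanguage });
  if (availability === "unavailable") {
    throw new Error(`no model for ${sourceLanguage} -> ${targetLanguage}`);
  }
  // create() resolves once the on-device model is ready (downloading it if needed).
  const translator = await TranslatorImpl.create({ sourceLanguage, targetLanguage });
  return translator.translate(text);
}
```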
Firefox's move to swap onnxruntime-web for native C++ inference is the telling part. They're quietly building a real runtime, not just a demo. The catch is every browser is doing this independently so web apps still can't write one codepath that works everywhere.
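In practice that means every app ends up writing its own shim: try the proposed standard API if the browser exposes it, otherwise ship your own WASM or server-side model. A sketch of that pattern; only the `Translator` shape follows MDN's draft, and `fallbackTranslate` is a hypothetical caller-supplied function:

```javascript
// One-codepath shim: prefer the built-in Translator API when present,
// otherwise use a caller-supplied fallback (WASM model, server call, etc.).
async function translateAnywhere(globals, fallbackTranslate, text, opts) {
  if (typeof globals.Translator !== "undefined") {
    // Built-in path: currently Chromium-only.
    const translator = await globals.Translator.create(opts);
    return translator.translate(text);
  }
  // No built-in support (e.g. Firefox pages today): bring your own model.
  return fallbackTranslate(text, opts);
}
```

The annoying part is the two paths have different performance and privacy characteristics, so the fallback isn't a drop-in equivalent, just the best you can do.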