Fitbit Adds Food and Water Tracking | Analysis by Brian Moineau

Fitbit gets hungrier — and thirstier — for your data

Today’s Fitbit update is more than a fresh coat of paint. The Fitbit Public Preview adds food and water logging, joining the broader app redesign and AI-powered personal health coach that Google has been rolling out in preview form. If you’ve been watching the gradual migration of Fitbit into Google’s ecosystem, this is one of those moments where the product starts to feel like the future Google described — and also like the kind of change that will stir conversation among longtime users.

What just landed in the Public Preview

  • The app now includes built-in food logging and water tracking so users can set calorie targets, log meals, and track hydration directly in the Fitbit app.
  • The Public Preview — originally focused on Premium subscribers and select Android users — is expanding access so free-tier users can try the redesigned interface and these nutrition features.
  • This expands a broader push: the redesigned app pairs a Material 3-inspired UI with a Gemini-powered “personal health coach” that uses your activity, sleep, and (now) nutrition data to give suggestions.
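
To make the logging idea concrete, here is a minimal sketch of how a daily food-and-water log could aggregate toward targets. Fitbit has not published its data model; every class and field name here (`FoodEntry`, `DailyLog`, `calorie_target`, `water_target_ml`) is hypothetical and purely illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class FoodEntry:
    """A single logged meal or snack (hypothetical model)."""
    name: str
    calories: int

@dataclass
class DailyLog:
    """One day's nutrition and hydration log rolled up against targets."""
    calorie_target: int
    water_target_ml: int
    meals: list = field(default_factory=list)
    water_ml: int = 0

    def log_meal(self, entry: FoodEntry) -> None:
        self.meals.append(entry)

    def log_water(self, ml: int) -> None:
        self.water_ml += ml

    def calories_remaining(self) -> int:
        # Target minus everything logged so far today.
        return self.calorie_target - sum(e.calories for e in self.meals)

    def hydration_progress(self) -> float:
        # Fraction of the daily water target reached (1.0 = goal met).
        return self.water_ml / self.water_target_ml

log = DailyLog(calorie_target=2200, water_target_ml=2500)
log.log_meal(FoodEntry("oatmeal", 350))
log.log_water(500)
print(log.calories_remaining(), log.hydration_progress())  # 1850 0.2
```

The point of the sketch is the rollup: once meals and water live in the same daily structure as activity and sleep, a coaching layer can reason over all of them together.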

Why this matters: nutrition and hydration are two of the largest behavioral levers for health outcomes. Bringing those logs into Fitbit’s new coaching experience is an obvious next step — it helps the AI see the whole picture, not just steps and sleep.

Why the timing and the rollout matter

Google started previewing the AI-powered Personal Health Coach last year, first to Premium users and a limited set of devices. The rollout has been gradual: Android users saw the earliest access, then iOS, and now more people on the free tier are being invited into the Public Preview.

That phased approach is pragmatic. It lets Google collect feedback, squash bugs, and iterate on features that touch sensitive user data — especially when the product starts to take in things like nutrition entries and (in other recent previews) medical records or continuous glucose monitor data.

Still, phased rollouts create friction: some users will see new nutrition and water screens immediately; others will wait days or weeks. And historically, Fitbit’s food/water logging has been a touchy subject for users when it’s buggy or when sync behavior with third-party apps breaks.

The redesign: not just cosmetics

  • Material 3 visuals, smoother animations, and a reorganized home experience aim to make daily logging simpler.
  • The Personal Health Coach (Gemini-based) turns logs into conversational guidance: it can suggest adjustments, summarize patterns, and help set targets.
  • Beyond nutrition, Google is adding resilience and sleep improvements, and plans to let eligible users link clinical records for a fuller health snapshot.

Put simply: Fitbit now wants to be both the place you record what you do and the place that explains what it means. That double role increases the product’s value — and the stakes.

What users should watch for

  • Data continuity: If you have historical food and water entries, confirm they sync correctly. Some preview users have reported migration hiccups after past major app updates.
  • Privacy and permissions: New features that ingest nutrition, hydration, and (in other previews) medical data mean you should double-check which Google/Fitbit account type is linked and which permissions you’ve granted.
  • Feature parity: The Public Preview sometimes exposes a UI before all back-end pieces are in place. Expect some functionality to behave differently or appear later.
  • Integration with third-party food trackers: If you rely on MyFitnessPal, Lose It!, or a smart scale to feed Fitbit, watch whether those integrations continue to sync smoothly.

A quick user checklist

  • Update the Fitbit app to the latest version from your app store.
  • Open Settings → Profile → Join Public Preview (if available) to get access.
  • Back up or note important historical data if you depend on it daily.
  • Review app permissions and the account linked to Fitbit (Google vs. legacy Fitbit account).

The broader picture

This update is a predictable but meaningful step in Fitbit’s evolution under Google. AI coaching without context is limited; nutrition and hydration bring context. Google is clearly aiming to stitch together device data, user-entered behavior, and — at times — clinical data to create a more personalized experience.

But that integration raises familiar trade-offs: convenience versus control, helpful nudges versus surprising recommendations, and the long-standing tension between new platform design and the muscle memory of long-term users. Some will love having one place to log a meal and ask an AI why their readiness score dropped; others will bemoan changes to workflows that used to be simple and reliable.

My take

I’m encouraged by Fitbit bringing food and water logging into the Public Preview — the product only becomes useful if it measures the things that actually move the needle. That said, Google will need to keep listening. Small quality-of-life details (quick add buttons, barcode scanning, consistent units for water, and reliable third-party sync) often determine whether people actually keep logging.

If Google gets those details right and keeps the privacy guardrails clear, this could be one of the stronger examples of practical, helpful AI in wellness. If not, it’ll feel like a shiny interface on top of the same old friction.


Android 17 Beta 3 Embraces Frosted Blur | Analysis by Brian Moineau

A frosted sequel: Android 17 Beta 3 leans harder into blur

If you pulled your notification shade on a Pixel running Android 17 Beta 3 and thought, “Hey — that’s more… frosty,” you weren’t imagining things. Android 17 Beta 3 continues the translucency trend that Android 16 started, rolling out blur and frosted-glass effects across more system surfaces to create a deeper, layered UI experience. This shift is subtle in screenshots but immediately noticeable in motion: backgrounds peek through panels, volume controls and menus feel lifted from the wallpaper, and the whole UI gains a softer, more tactile appearance. (9to5google.com)

What Android 17 Beta 3 is changing (and why it matters)

  • Android 16 introduced translucency to areas like the notification shade, Quick Settings, and app drawer as part of Material 3 Expressive. Android 17 Beta 3 expands that vocabulary, applying blur more widely to system menus such as the volume panel, recents/overview, and other transient surfaces. (9to5google.com)

  • The visual aim is to add depth and context: instead of solid blocks of color, UI layers let you maintain a faint sense of what’s behind a panel. That guides focus without removing ambient cues — a design choice that can improve readability and polish when executed well. (9to5google.com)

  • Practically, these changes come via internal builds and leaked screenshots rather than an official announcement, so the final appearance and which elements get blurred could still shift before the stable release. (9to5google.com)
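
The layering idea in the second bullet — a panel that lets a faint sense of the background bleed through — comes down to standard alpha compositing. A minimal sketch with plain RGB tuples (this is the generic "over" operator, not any Android API):

```python
def composite_over(fg, bg, alpha):
    """Blend a foreground panel color over a background pixel.

    fg, bg: (r, g, b) tuples in 0-255; alpha: panel opacity in 0.0-1.0.
    Classic "over" operator: out = alpha * fg + (1 - alpha) * bg, per channel.
    """
    return tuple(round(alpha * f + (1 - alpha) * b) for f, b in zip(fg, bg))

# A mostly-opaque white panel over a saturated blue wallpaper pixel:
panel = (255, 255, 255)
wallpaper = (30, 60, 200)
print(composite_over(panel, wallpaper, 0.85))  # (221, 226, 247)
```

At 85% opacity the panel stays readable while the wallpaper faintly tints it — exactly the "ambient cue" effect the frosted surfaces aim for; real system blur adds a Gaussian pass on top of this blend.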

Transitioning from flat to frosted visuals is a design decision that influences more than aesthetics. It affects performance, battery use, accessibility, and how third-party apps should harmonize with system chrome.

Looking closer: the visual and technical trade-offs

Designers love blur because it creates hierarchy without hiding context. Users, meanwhile, will focus on three practical things: performance, consistency, and control.

  • Performance: Gaussian blur and real-time translucency can be GPU-heavy. On modern Pixels and flagship SoCs, this is usually fine, but older or budget devices may see frame drops or battery impacts when the system applies blur everywhere. Early beta reports from testers have already flagged occasional visual banding and inconsistent blur behavior during transitions. (reddit.com)

  • Consistency: Android’s strength is diversity — many OEMs skin and extend the platform. If Google bakes blur and translucency deeper into core APIs, OEMs and third-party apps may adopt it inconsistently, resulting in a fragmented look across devices. Conversely, clearer Material guidance could unify the ecosystem. (androidauthority.com)

  • Control and accessibility: Not everyone wants motion, translucency, or extra visual effects. Accessibility settings (reduce motion, high contrast) must be respected, and users should be able to toggle or tone down blur without losing functionality. The beta conversations show mixed feelings from users: some praise the polish, others miss sharper contrast or report that blur sometimes disappears unexpectedly. (reddit.com)
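
The performance point above is why real-time blur is rarely implemented naively: a 2D Gaussian kernel of width k costs roughly k² samples per pixel, but because the kernel is separable it can be split into a horizontal pass and a vertical pass of k samples each. A toy one-dimensional pass (a box blur here for simplicity; actual system compositors use optimized GPU shaders, not anything like this):

```python
def box_blur_1d(values, radius):
    """One horizontal blur pass: average each sample with its neighbors.

    Edge samples clamp to the row bounds, so each output is the mean of the
    in-range window. Running such a pass over rows and then columns
    approximates a 2D blur at O(k) samples per pixel instead of O(k^2).
    """
    out = []
    n = len(values)
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        window = values[lo:hi]
        out.append(sum(window) / len(window))
    return out

row = [0, 0, 255, 0, 0]  # a single bright pixel in a dark row
print(box_blur_1d(row, 1))  # [0.0, 85.0, 85.0, 85.0, 0.0]
```

Even with the separable trick, every blurred surface means extra full-screen texture reads and writes per frame — which is exactly where older GPUs start dropping frames.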

Why this feels a lot like trends elsewhere

It’s not accidental that commentators are likening Android’s frosted look to Apple’s Liquid Glass and to UI flourishes from manufacturers like Samsung and OnePlus. Design trends ripple: once a visual approach proves clear and appealing, others iterate on it. Material 3 Expressive opened the door, and Android 17 feels like Google exploring where that language can go — while walking the line between inspiration and imitation. Many outlets and design observers have already pointed out the resemblance. (tomsguide.com)

That said, Google’s execution matters: because Android supports so many hardware and software combinations, the company needs robust fallbacks and performance profiles so the same design language can translate across devices without slowing older hardware down.

What to watch in the coming months

  • Will blur be optional? Ideally, Android should expose a system-level toggle for blur intensity or a simple on/off, plus respect existing accessibility options.

  • Will Google provide developer guidance? If Material components and system surfaces begin to rely on translucency, developers will need clear guidelines for contrast, legibility, and animation timing.
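
One concrete piece of that guidance already exists: WCAG 2.x defines a contrast ratio from relative luminance, and it is the standard check for whether text stays legible over a tinted or blurred surface. A sketch of the published formula (the math is from WCAG; the Python wrapper is illustrative):

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance for an sRGB color with 0-255 channels."""
    def channel(c):
        c = c / 255
        # Linearize the sRGB transfer curve per the WCAG definition.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio (L1 + 0.05) / (L2 + 0.05); WCAG AA wants >= 4.5:1 for body text."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # black on white: 21.0
```

The catch with frosted surfaces is that the effective background is whatever the wallpaper happens to be, so a fixed text color can pass this check on one home screen and fail it on another — which is precisely why developers will want explicit system guidance.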

  • How will the final build balance battery and GPU load? Expect iterative QPR (Quarterly Platform Release) updates or optimizations before the stable Android 17 to smooth performance and reduce artifacts like banding. Early tester reports already hint at such quirks. (reddit.com)

Android 17 Beta 3: what this means for everyday users

For most people who upgrade to Android 17 when it lands, the change will be mostly visual: settings panels, volume sliders, and other transient surfaces will feel softer and more “layered.” That can make the OS feel fresher without changing workflows.

However, users of lower-specced devices or power-conscious folks should pay attention to early benchmarks and battery reports before upgrading, especially on betas. If blur becomes the default everywhere with no user control, that could frustrate a section of the user base. Early beta chatter suggests Google is still iterating. (9to5google.com)

My take

Design evolution is a balancing act. Android 17 Beta 3’s expanded blur is a logical next step after Android 16’s Material 3 Expressive work: it adds nuance, context, and a modern sheen that many users will appreciate. At the same time, Google must be pragmatic — offering opt-outs, ensuring smooth performance, and providing clear developer guidance. If it gets those elements right, Android will look cleaner and feel more cohesive; if not, the effect could come off as gratuitous fluff or create uneven experiences across devices.

Overall, I welcome the polish — but I’m watching for the controls and performance optimizations that will make that polish sustainable for everyone.
