
🧠 AI Goes Pro: Health, Hardware, and the Future of Work

Today in AI: OpenAI launches ChatGPT Health, NVIDIA goes deeper into AI infrastructure, and Gumroad reveals how it actually codes with AI.

👋 Hello hello,

This week in AI feels… oddly human.

OpenAI wants to help you prep for doctor visits. Lenovo’s new assistant is basically your laptop’s plus-one. And NVIDIA’s teaming up with the music industry — not to replace artists, but to give them better tools.

The theme? AI’s finally starting to play nice with the real world.

Let’s dive in.

🔥🔥🔥 Three big updates

OpenAI introduced ChatGPT Health, a version of ChatGPT tailored specifically for health and wellness. Users can connect medical records, apps, and health data to get advice on things like doctor visit prep, interpreting lab results, fitness plans, and diet suggestions. It’s designed to be informational and supportive, not a replacement for physicians or licensed professionals. Early reactions applaud its potential to help people navigate healthcare more easily — though some developers note it overlaps with other health AI initiatives and won’t replace expert care. 

At CES 2026, Lenovo (and its Motorola division) unveiled Qira, a cross-device personal AI agent built into systems instead of just an app. Qira isn’t a chatbot you open — it’s ambient intelligence deeply integrated across laptops, tablets, phones, and wearables, with awareness of context and continuity as you move between tasks.  

Universal Music Group (UMG) and NVIDIA announced a strategic collaboration to extend NVIDIA’s Music Flamingo model — an AI system that understands music nuance like harmony, structure, timbre, lyrics, and cultural context — into UMG’s catalog and creative workflows. The deal is big not for hype, but for its emphasis on responsible AI development that supports artists, protects copyrights, and gives creators new tools for discovery, analysis, and engagement with fans. 

Part of the partnership includes an artist incubator where musicians, producers, and songwriters help co-design and test AI tools — aiming to avoid generic, low-quality AI output and instead unlock deeper, more meaningful music interactions powered by AI. 

Also on the creative side: instead of wrestling with sliders and lighting rigs in separate apps, Relight lets creators shape the look and feel of scenes quickly and with precision — a win for photographers, videographers, and social creators alike.

🔥🔥 Two tools worth trying

Ralph is an autonomous AI coding loop gaining buzz in dev circles. It repeatedly runs your chosen agent (like AmpCode) until tasks are complete, managing fresh context windows each iteration and keeping memory in git history and text files. Builders report shipping real features overnight with Ralph’s loop — a glimpse at what hands-off coding workflows could look like.
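The core pattern is simple enough to sketch. Below is a minimal, hypothetical TypeScript version of the loop idea — not Ralph's actual code — where `runAgent` stands in for one invocation of your coding agent, and the only memory carried between iterations is a plain progress file (plus, in the real pattern, git history):

```typescript
// A sketch of the "Ralph"-style loop (illustrative, not Ralph's real implementation):
// rerun an agent with a fresh context each pass until it signals completion,
// persisting notes between runs in a text file.
import { readFileSync, writeFileSync, existsSync } from "node:fs";

type AgentResult = { done: boolean; notes: string };

function ralphLoop(
  runAgent: (progress: string) => AgentResult, // one agent invocation (e.g. your CLI agent)
  progressFile = "PROGRESS.md",                // assumed file name for persisted memory
  maxIterations = 50,                          // safety cap so the loop always terminates
): number {
  for (let i = 1; i <= maxIterations; i++) {
    // Fresh context every pass: the agent only sees the persisted notes.
    const progress = existsSync(progressFile)
      ? readFileSync(progressFile, "utf8")
      : "";
    const result = runAgent(progress);
    writeFileSync(progressFile, result.notes); // memory survives to the next pass
    if (result.done) return i;                 // task complete: report iteration count
  }
  return maxIterations;
}
```

In practice `runAgent` would shell out to a real agent; the point of the sketch is the loop shape: stateless invocations, externalized memory, and a hard iteration cap.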

If you’re curious how an established tech company actually builds with AI, Gumroad is the blueprint. Under Sahil Lavingia’s “anti-work” philosophy, Gumroad has quietly become an AI-first product company — where agents write, review, and ship production-level code.

Their workflow blends autonomous coding agents, prompt-based product development, and human oversight loops to balance creativity with control. The approach has helped the team iterate faster, make sharper product bets, and rethink what “building software” even means.

(Watch this podcast episode for more details.)

🔥 A workflow that will save you hours

Here’s a practical process to create a professional scroll-triggered animation (the kind designers used to charge thousands for) in about 10 minutes:

1ļøāƒ£ Generate start and end frames

Prompt an image model for two key visuals — a regular frame and a detailed “exploded” version — that you’ll use as the basis for your animation.

2ļøāƒ£ Create a video from the frames

Use a tool like Google Frameworks for Video (or similar) to turn those two frames into a short video clip.

3ļøāƒ£ Extract frames to images

Convert that clip into a sequence of individual frames (e.g., with EzGIF or a similar tool), generating smooth steps between your visuals.

4ļøāƒ£ Import frames into code

In a Next.js or similar project, place the sequence of images in your public folder. Then build a scroll component that loads and advances images based on scroll position — making the subject (in the demo, a chip) explode as users scroll.

The result: a smooth, scroll-driven animation that feels modern and engaging, built with familiar tools instead of expensive design work.
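The scroll logic in step 4 boils down to mapping scroll progress to a frame index. Here’s a minimal TypeScript sketch under assumed names — the `/frames/frame-NNN.webp` paths and 120-frame count are placeholders for whatever your extracted sequence looks like:

```typescript
// Map a scroll fraction (0..1) to a frame index (0..frameCount-1),
// clamping out-of-range input so fast scrolls never pick a missing frame.
function frameForScroll(fraction: number, frameCount: number): number {
  const clamped = Math.min(1, Math.max(0, fraction));
  return Math.min(frameCount - 1, Math.floor(clamped * frameCount));
}

// Structural type so the sketch type-checks outside the browser too.
type ImageLike = { src: string };

// Browser wiring: swap the image source as the user scrolls.
// (Uses globalThis so the file also loads cleanly in non-browser tooling.)
function initScrollAnimation(img: ImageLike, frameCount: number): void {
  const g = globalThis as any; // window/document at runtime in the browser
  const onScroll = () => {
    const max = g.document.documentElement.scrollHeight - g.innerHeight;
    const frame = frameForScroll(max > 0 ? g.scrollY / max : 0, frameCount);
    // Assumed naming scheme for the extracted frames in /public/frames:
    img.src = `/frames/frame-${String(frame).padStart(3, "0")}.webp`;
  };
  g.addEventListener("scroll", onScroll, { passive: true });
  onScroll(); // render the correct frame on initial load
}
```

In a real project you’d also preload the images up front so frame swaps don’t flicker, and in React you’d wrap the listener in a `useEffect` with cleanup.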

(Watch the full demo.)

Do you like this new format?


💬 Quick poll: What's the AI tool you use daily that nobody talks about?

Hit reply — we're always hunting for underrated gems.

Until next time,
Kushank @DigitalSamaritan
