Improving Our AI Assistant

Vue.js
AI SDK
TypeScript
Node.js

Redesigning Layers' AI assistant from a hidden drawer feature into an inline, action-based system with streaming responses and workflow-specific quick actions.

About The Project

Comunicados (Announcements) is one of the most used apps in our platform—it's how school administrators send posts to students and parents. Two years ago, we introduced LIA (Layers AI) to help administrators write better posts. But the original implementation was basic: a 200-character prompt input hidden behind a button in a drawer, which opened another drawer for the chat.

The goal was to modernize this experience. Make the AI feel like a natural part of the writing flow, not a detour.

The Challenge: Friction in the Workflow

The existing AI feature had adoption problems rooted in its design:

  • Hidden Away: Users had to click through multiple layers (drawer → button → another drawer) just to access the AI. Too many steps meant many users never bothered.
  • Generic Output: The AI took a simple prompt and returned a generic response. It didn't understand the context of what the user was trying to do—revise, expand, summarize, or start fresh.
  • No Streaming: Users typed a prompt, hit send, and waited. No feedback until the full response appeared, which felt slow and disconnected.

School administrators were leaving the platform to use ChatGPT or other tools, then copy-pasting results back. We were losing them at the moment they needed help most.

The Solution: Inline AI with an Action System

1. Bringing AI Closer to the Content

The biggest UX change was positioning. Instead of hiding the AI in a separate drawer, we moved it inline—right next to where the user writes their post description. This eliminated the context switch and made the AI feel like a writing companion rather than a separate tool.

2. Streaming Responses

I implemented real-time streaming using the AI SDK, so users see the response generating word by word. This small change made the AI feel faster and more responsive, even when the actual generation time was the same.
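The consumption pattern can be sketched as follows. The AI SDK's `streamText()` exposes an async-iterable `textStream`; here a simulated generator stands in for a real model response, and the function and callback names are illustrative, not the production code.

```typescript
// Simulated token stream standing in for the AI SDK's textStream.
async function* fakeTextStream(): AsyncGenerator<string> {
  for (const token of ["Dear ", "parents, ", "classes ", "resume ", "Monday."]) {
    yield token;
  }
}

// Append each chunk to the UI as it arrives instead of waiting for
// the full response to finish generating.
async function renderStream(
  stream: AsyncIterable<string>,
  onChunk: (partial: string) => void
): Promise<string> {
  let text = "";
  for await (const chunk of stream) {
    text += chunk;
    onChunk(text); // e.g. update the inline panel's draft text
  }
  return text;
}
```

With the real SDK, the same consumer would receive `result.textStream` from a `streamText()` call instead of the fake generator.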

3. The Action System

This was the core architectural contribution. Instead of treating every AI interaction as a generic "prompt → response" flow, I designed a system where each UI interaction maps to a specific action.

For example:

  • The Revise quick action triggers the revise action, which takes the user's existing description, wraps it in a tailored prompt, and adjusts model parameters for editing-focused output.
  • The Expand action takes sparse notes and generates a fuller post.
  • The Shorten action reduces the length of the post while maintaining its meaning.

Each action encapsulates its own prompt engineering, input requirements, and model settings. The UI just invokes the right action—it doesn't need to know the details.
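A minimal sketch of this pattern, where each action bundles its own prompt template and model parameters. The interface, prompt wording, and temperature values here are assumptions for illustration, not the production code.

```typescript
// Each action owns its prompt engineering and model settings.
interface AIAction {
  id: string;
  buildPrompt: (input: string) => string;
  temperature: number; // editing actions run cooler than creative ones
}

const actions: Record<string, AIAction> = {
  revise: {
    id: "revise",
    buildPrompt: (input) =>
      `Revise the following announcement for clarity and tone:\n\n${input}`,
    temperature: 0.3,
  },
  expand: {
    id: "expand",
    buildPrompt: (input) =>
      `Expand these notes into a full announcement for parents:\n\n${input}`,
    temperature: 0.7,
  },
  shorten: {
    id: "shorten",
    buildPrompt: (input) =>
      `Shorten this announcement while keeping its meaning:\n\n${input}`,
    temperature: 0.3,
  },
};

// The UI only needs an action id and the user's text; the registry
// resolves everything else.
function prepareRequest(actionId: string, input: string) {
  const action = actions[actionId];
  if (!action) throw new Error(`Unknown action: ${actionId}`);
  return { prompt: action.buildPrompt(input), temperature: action.temperature };
}
```

Adding a new quick action then means registering one more entry; no UI component has to learn new prompt logic.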

4. Model Flexibility

By building on the AI SDK, we decoupled the system from any specific AI provider. The action system works the same whether we're using GPT, Claude, or another model. This gives us flexibility to switch providers or run A/B tests without touching the frontend.
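One way this decoupling can look: actions reference a logical provider, and a single resolver maps it to a concrete model id. The provider names and model ids below are illustrative assumptions; with the AI SDK, the resolved id would feed a provider factory such as `openai(...)`.

```typescript
// Swapping providers (or running an A/B test) means changing this one
// mapping; neither the actions nor the frontend are touched.
type Provider = "openai" | "anthropic";

const modelTable: Record<Provider, string> = {
  openai: "gpt-4o-mini",
  anthropic: "claude-3-5-haiku-latest",
};

function resolveModel(provider: Provider): string {
  return modelTable[provider];
}
```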

What I Built

  • Architectural Planning: Designed the action system and how it would integrate with our existing codebase.
  • UI Implementation: Worked with our designer to implement the Figma screens, including the inline AI panel and quick action buttons.
  • Action System Backend: Built the action registry, prompt templates, and parameter configurations for each action type.
  • Error Handling & Rate Limiting: Designed the system to gracefully handle API failures, rate limits, and edge cases without breaking the user's flow.
  • Streaming Integration: Implemented the real-time response streaming on both backend and frontend.

Key Results

  • Higher Engagement: Users interact with the AI more frequently now that it's visible and accessible inline.
  • Reduced Context Switching: Administrators no longer leave the platform to use external AI tools for writing help.
  • Foundation for Future AI Features: The action system is reusable. Any UI interaction across the platform that could benefit from AI can now invoke an existing action or define a new tailored one. This feature became a cornerstone of our company's AI strategy.

The project had a tight deadline, and we shipped it on time. The key was front-loading the planning—defining the action system architecture before writing code—and staying focused on what would actually improve the administrator's workflow. Sometimes the best features aren't about adding complexity, but about removing friction.