
Reading time: 8 mins

Anyline

Rebuilt the WebSDK demo to reduce evaluation friction and help users experience the value of OCR technology within seconds—making the experience fast, easy to understand, and simple to try.

Detailed Case Study

Domain

B2B, SaaS,

OCR & Data Capture

My Role

Lead UX Designer

Duration

~2 Months

Headquarters

Vienna, Austria

Team Structure

Product Owner

Project Manager

Tech Architect

2 React Developers

UX Designer

20-Second Impact Overview

~15s faster Time-to-First-Scan, with even greater gains during repeat scans.

26% drop-off reduction via a clear 3-step flow, instant rescan, clearer CTAs, and real-time feedback.

40% fewer clicks through a streamlined flow.

SUS improved from 52 → 74 with a clearer, more accessible design.

Lead conversion rate increased from 6% to 10% by reducing friction.

01 Business Context

Anyline is a leading European technology company specializing in OCR and AI-driven text recognition. Their WebSDK demo plays a critical role in helping potential customers evaluate OCR accuracy, speed, and integration feasibility before adopting the product.

However, the existing demo experience was not effectively supporting this evaluation moment.


Business Goals

  1. Showcase the WebSDK's scanning power through an intuitive demo.

  2. Increase user engagement and demo-to-trial conversion.

  3. Reduce friction during first-time and repeat evaluations.

  4. Deliver a consistent, brand-aligned (Anyline) experience.

  5. Build a fully functional, scalable, future-ready interface within tight timelines.

02 Research & Behavioral Analysis

Stakeholder Discovery

Early stakeholder discussions highlighted a critical expectation: users needed to reach a successful scan quickly and confidently to trust the product and move forward.

This shifted the goal from “improving the UX” to “turning evaluation into a low-risk, high-confidence experience.”

Literature Review

Reviewed product documentation, brand guidelines, and scan datasets to understand OCR capabilities and constraints.
Insights highlighted the need for clearer feedback, strong visual guidance, and accessible, responsive interaction patterns.

User Interviews & Behavioral Patterns

Users were asked to explore the WebSDK demo as they would during a real product evaluation, focusing on where they hesitated, what confused them, and whether they trusted the results.

Behavioral Patterns Observed:

Across sessions, a few consistent behaviors emerged:

  • Users hesitated early in the flow

  • Scanning felt slower than it actually was

  • Navigation required unnecessary cognitive effort

  • Retrying felt costly

Usability Benchmark (SUS)

A baseline SUS test scored the demo at 52, well below the industry benchmark of 68, confirming high friction during evaluation.
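For readers unfamiliar with how a SUS number like 52 is produced: the questionnaire has ten 1–5 Likert items, odd-numbered items contribute (rating − 1), even-numbered items contribute (5 − rating), and the raw sum is multiplied by 2.5 to yield a 0–100 score. A minimal scoring sketch (illustrative only; the study's raw responses are not part of this case study):

```typescript
// Standard SUS (System Usability Scale) scoring.
// `responses` holds the ten 1–5 Likert ratings in questionnaire order.
function susScore(responses: number[]): number {
  if (responses.length !== 10) {
    throw new Error("SUS requires exactly 10 item responses");
  }
  const raw = responses.reduce((sum, rating, i) => {
    // Items 1, 3, 5, 7, 9 (even index) are positively worded: rating - 1.
    // Items 2, 4, 6, 8, 10 (odd index) are negatively worded: 5 - rating.
    return sum + (i % 2 === 0 ? rating - 1 : 5 - rating);
  }, 0);
  return raw * 2.5; // raw sum spans 0–40; scale to 0–100
}

// A neutral "3" on every item lands exactly at the midpoint.
console.log(susScore([3, 3, 3, 3, 3, 3, 3, 3, 3, 3])); // 50
```

Against this scale, the baseline of 52 sits well below the commonly cited average of 68, which is what flagged the demo as high-friction.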

Heuristic Evaluation

A heuristic review revealed that the demo violated multiple usability principles.

Key issues included:

  • No clear progress or status during scanning

  • Limited ability to rescan or recover from errors

  • Unclear instructions and file requirements

  • Users forced to rely on memory instead of visible context

  • Visually cluttered and inconsistent screens

User Journey Mapping

A current-state journey map identified where users hesitated, lost confidence, and dropped off during the evaluation flow.

It shifted perspective from inside-out product thinking to the outside-in user experience and helped break down silos across Product, Sales, and Design, creating a shared view of evaluation friction and confidence gaps.

User journey map

03 Synthesis to Strategy

Synthesis methods

  • Key Behavioral Themes (Affinity Mapping)

  • Key Challenges Identification

  • Persona

  • User Stories

  • Insights to Strategy

Key Behavioral Themes

Affinity map

Key Challenges Identified

Research was synthesized into five core challenges:

Key Challenges

Persona

Persona

Insights to Strategy

Strategy: Turn the WebSDK demo into a confidence-building evaluation experience.

User research showed that early demo moments were dominated by hesitation, unclear feedback, and high cognitive effort.

So the experience was designed around three strategic principles:

  • Fast success builds trust

  • Uncertainty kills engagement

  • Exploration should feel safe

04 Translating Strategy into Interaction

With the structure and visual direction established, the interface was refined around four design principles aimed at making the product easy to try, easy to understand, and safe to explore during evaluation.

1. Accelerate Time to First Value

The interface was designed so users could reach their first successful scan within seconds, reducing friction during evaluation.

  • Sample images enabled instant testing

  • Upload triggers automatic scanning

  • Removed explicit “start scan” actions

  • Minimal setup before first result

2. Reduce Uncertainty Through Clear System Feedback

To build user confidence, the system continuously communicates what is happening and when results are ready.

  • Visible scan progress and system states

  • Clear success, loading, and error feedback

  • Result confidence and processing time displayed
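One way to make feedback like this reliable in a React/TypeScript demo is to model the scanner's states explicitly, so the UI can only ever render a state the system is actually in. A hypothetical sketch (names such as `ScanState` and `statusLabel` are illustrative, not the actual WebSDK API):

```typescript
// Hypothetical sketch of explicit scan states; not the actual WebSDK API.
// A discriminated union forces the UI to handle every state the system
// can be in, so feedback is never ambiguous.
type ScanState =
  | { kind: "idle" }
  | { kind: "scanning"; progress: number } // 0–100
  | { kind: "success"; text: string; confidence: number; ms: number }
  | { kind: "error"; message: string };

// Maps each state to the feedback the user sees.
function statusLabel(state: ScanState): string {
  switch (state.kind) {
    case "idle":
      return "Upload an image or pick a sample to scan";
    case "scanning":
      return `Scanning… ${state.progress}%`;
    case "success":
      return `Done in ${state.ms} ms (${Math.round(state.confidence * 100)}% confidence)`;
    case "error":
      return `Scan failed: ${state.message}. Try again or use a sample image.`;
  }
}

console.log(statusLabel({ kind: "scanning", progress: 40 })); // Scanning… 40%
```

Because the union is exhaustive, adding a new state later forces every status surface in the UI to account for it, which keeps success, loading, and error feedback consistent.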

3. Encourage Safe Exploration

Evaluation often requires testing multiple inputs and modes. The interface supports experimentation without forcing users to restart the workflow.

  • Instant rescan capability

  • Mode switching without resetting the flow

  • Persistent sample images for quick retries
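The mechanics behind "retrying is cheap" can be sketched as state that survives mode switches, so the user's last input is always one click away from a rescan. A hypothetical sketch (names like `DemoState`, `switchMode`, and the mode values are illustrative, not Anyline's API):

```typescript
// Hypothetical sketch: state that keeps the last input around, so
// switching scan modes or rescanning never resets the user's work.
type ScanMode = "barcode" | "meter" | "license-plate";

interface DemoState {
  mode: ScanMode;
  lastImage: string | null; // id of the sample or uploaded image
}

// Switching modes preserves the current image for an instant retry.
function switchMode(state: DemoState, mode: ScanMode): DemoState {
  return { ...state, mode };
}

// Rescanning reuses the last image instead of forcing a fresh upload.
function rescan(state: DemoState): string | null {
  return state.lastImage;
}

let state: DemoState = { mode: "barcode", lastImage: "sample-01" };
state = switchMode(state, "meter");
console.log(rescan(state)); // "sample-01" (the image survives the switch)
```

The design point is that exploration never destroys context: the cost of "what happens if I try a different mode?" drops to near zero.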

4. Reinforce Trust Through Clarity and Consistency

A restrained visual system ensures the interface feels reliable, readable, and technically credible.

  • Consistent layout across screens

  • Clear hierarchy for results and actions

  • Accessible contrast and typography

The System in Action


These decisions collectively shaped the final experience—one that feels fast, clear, forgiving, and trustworthy during product evaluation.

Web - Prototype

Mobile - Prototype

Usability Validation (SUS)

Post-Redesign Outcome
  • SUS improved from 52 → 74 (+22),
    moving the experience from below-average to good usability.

  • Confirmed that the redesigned demo felt clearer, faster, and easier to use during evaluation.

Handoff & Collaboration

  • Partnered closely with engineering to ensure smooth implementation.

  • Design intent, motion behavior, and accessibility requirements were preserved through structured handoff and regular cross-functional reviews.

05 Outcomes & Learnings

Outcomes

The redesigned WebSDK demo delivered measurable improvements across usability, efficiency, and conversion:

  • 40% fewer clicks per scan, driven by a simplified 3-step flow.

  • ~15s faster Time-to-First-Scan, with even greater gains during repeat scans.

  • 26% drop-off reduction.

  • SUS improved from 52 → 74.

  • Lead conversion increased from 6% → 10%.

  • WCAG 2.1 AA compliant, ensuring accessibility across devices and users.

  • Delivered on time with 100% client satisfaction.

Key Learnings

  • Clarity matters as much as speed
    Simplifying the flow reduced effort, while clear feedback and recovery reduced hesitation.

  • Accessibility isn’t optional
    Designing for WCAG compliance improved clarity and usability for all users, not just edge cases.

  • Prototypes accelerate buy-in
    Letting stakeholders experience the flow firsthand reduced ambiguity and sped up approvals.

  • Design efficiency = business efficiency
    Simplifying workflows reduced user effort while directly improving engagement and conversion.

“The redesigned demo has completely elevated how we present our technology. It’s faster, smarter, and feels smooth.”

— Lucie M., Project Manager, Anyline
