
Reading time: 8 mins

Anyline

Rebuilt the WebSDK demo to reduce evaluation friction and let users experience the value of OCR technology within seconds, making the demo fast, easy to understand, and simple to try.

Domain

B2B, SaaS, OCR & Data Capture

My Role

Lead UX Designer

Duration

~2 Months

Client

Anyline (Austria) 

Team Structure

Product Manager

Project Manager

Tech Architect

2 React Developers

UX Designer

20-Second Impact Overview

26% drop-off reduction via a clear 3-step flow, instant rescan, clearer CTAs, and real-time feedback.

~15s faster Time-to-First-Scan, with even greater gains during repeat scans.

40% fewer clicks through a streamlined flow.

SUS improved from 52 → 74 with clearer, accessible design.

Lead conversion rate increased from 6% to 10% by reducing friction.

01 Business Context

Anyline is a leading European technology company specializing in OCR and AI-driven text recognition. Their WebSDK demo plays a critical role in helping potential customers evaluate OCR accuracy, speed, and integration feasibility before adopting the product.

However, the existing demo experience was not effectively supporting this evaluation moment.


Business Goals

  1. Showcase the WebSDK's scanning power through an intuitive demo.

  2. Increase user engagement and demo-to-trial conversion.

  3. Reduce friction during first-time and repeat evaluations.

  4. Deliver a consistent experience aligned with the Anyline brand.

  5. Build a fully functional, scalable, future-ready interface within tight timelines.

02 Research and Behavioral Analysis

Stakeholder Discovery

Stakeholders made it clear that the WebSDK demo was a business-critical evaluation tool, not just a UI.
If users couldn’t reach a successful scan quickly and confidently, they wouldn’t trust the product enough to move forward.

This shifted the goal from “improving the UX” to “turning evaluation into a low-risk, high-confidence experience.”


Literature Review


User Interviews

Users were asked to explore the WebSDK demo as they would during a real product evaluation, focusing on where they hesitated, what confused them, and whether they trusted the results.

Behavioral Patterns Observed:

Across sessions, a few consistent behaviors emerged:

  • Users hesitated early in the flow

  • Scanning felt slower than it actually was

  • Navigation required unnecessary cognitive effort

  • Retrying felt costly


Usability Benchmark (SUS)

A baseline SUS test scored the demo at 52, well below the industry benchmark of ~68, confirming high friction during evaluation.
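For context, SUS is scored from ten 1–5 Likert responses: odd-numbered (positively worded) items contribute (response − 1), even-numbered items contribute (5 − response), and the total is multiplied by 2.5 to land on a 0–100 scale. A minimal scoring sketch in TypeScript:

```ts
// Standard SUS scoring: ten 1–5 Likert responses → one 0–100 score.
function susScore(responses: number[]): number {
  if (responses.length !== 10) {
    throw new Error("SUS requires exactly 10 responses");
  }
  const sum = responses.reduce(
    // 0-indexed even = odd-numbered item (positive wording): response − 1;
    // 0-indexed odd = even-numbered item (negative wording): 5 − response.
    (acc, r, i) => acc + (i % 2 === 0 ? r - 1 : 5 - r),
    0
  );
  return sum * 2.5;
}

// Example: susScore([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]) === 75
```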


Heuristic Evaluation

A heuristic review revealed that the demo violated multiple usability principles.

Key issues included:

  • No clear progress or status during scanning

  • Limited ability to rescan or recover from errors

  • Unclear instructions and file requirements

  • Users forced to rely on memory instead of visible context

  • Visually cluttered and inconsistent screens


Competitive Research


User Journey Mapping

A current-state journey map was used to identify where users hesitated, lost confidence, and dropped off during the demo flow.

User journey map

03 From Synthesis to Strategy

Synthesis methods

  • Key Behavioral Themes (Affinity Mapping)

  • Key Challenges Identification

  • Persona

  • User Stories

  • Insights to Strategy

Key Behavioral Themes

Affinity map

Key Challenges Identified

Research was synthesized into five core challenges:

Key Challenges

Persona

Persona

Insights to Strategy

Strategy: Turn the WebSDK demo into a confidence-building evaluation experience.

User research showed that early demo moments were dominated by hesitation, unclear feedback, and high cognitive effort.

So the experience was designed around three strategic principles:

  • Fast success builds trust

  • Uncertainty kills engagement

  • Exploration should feel safe

Low-fidelity screens
Moodboard
User Flow

Strategy 1

Accelerate Time-to-First-Value

Addresses: Slow Time-to-First-Value

Design Decisions

  • Simplified the journey into a clear 3-step flow

  • Enabled instant upload → auto-scan (sketched after this list)

  • Reduced upfront decisions

  • Eliminated explicit “submit” or “start scan” actions

  • Added sample/demo images for users without files
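To make the upload → auto-scan decision concrete, here is a minimal React sketch of the pattern, assuming a hypothetical scanImage helper in place of the actual WebSDK call: choosing a file is itself the trigger, so no submit or start-scan button exists.

```tsx
import { useState, type ChangeEvent } from "react";

// Hypothetical stand-in for the WebSDK's OCR call; the real API differs.
async function scanImage(file: File): Promise<string> {
  return `recognized text from ${file.name}`; // placeholder result
}

export function InstantScan() {
  const [result, setResult] = useState<string | null>(null);

  // Selecting a file IS the "start scan" action: no submit button,
  // no confirmation screen between upload and result.
  async function onFileSelected(e: ChangeEvent<HTMLInputElement>) {
    const file = e.target.files?.[0];
    if (!file) return;
    setResult(await scanImage(file));
  }

  return (
    <div>
      <input type="file" accept="image/*" onChange={onFileSelected} />
      {result && <p>{result}</p>}
    </div>
  );
}
```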

Strategy 1

Strategy 2

Make System State Always Visible

Addresses: Low Confidence During Evaluation

Design Decisions

  • Real-time progress indicators

  • Clear loading, success, and error states (modeled in the sketch after this list)

  • Contextual microcopy explaining what’s happening
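One way to guarantee the system state is always visible is to model it as a discriminated union, so the UI literally cannot render without picking an explicit idle, scanning, success, or error branch. A sketch with illustrative names, not the production code:

```ts
// Every moment of the scan maps to exactly one explicit state;
// there is no silent in-between the UI can fall into.
type ScanState =
  | { status: "idle" }
  | { status: "scanning"; progress: number } // drives the progress bar
  | { status: "success"; text: string }
  | { status: "error"; message: string };

// Each state maps to visible, human-readable microcopy.
function statusLine(state: ScanState): string {
  switch (state.status) {
    case "idle":
      return "Upload an image or pick a sample to start.";
    case "scanning":
      return `Scanning… ${Math.round(state.progress * 100)}%`;
    case "success":
      return "Done. Review the recognized text below.";
    case "error":
      return `Something went wrong: ${state.message}. Rescan to try again.`;
  }
}
```

Because the switch is exhaustive, adding a new state forces every screen that renders status copy to handle it.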

Strategy 2

Strategy 3

Reduce Cognitive Effort

Addresses: Too Much Effort for a Simple Scan

Design Decisions

  • Removed redundant screens and unnecessary confirmations

  • Removed silent moments with no visible feedback

  • Simplified mode selection and grouping related options

  • Clarified primary and secondary actions through visual hierarchy

  • Replaced technical jargon with human-readable language

Strategy 3

Strategy 4

Safe Retry & Exploration

Addresses: Limited Room for Exploration

Design Decisions

  • Enabled easy rescan without restarting the flow (see the sketch after this list)

  • Allowed mode switching from within the results view
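In practice, “rescan without restarting” is mostly a question of what gets reset. A sketch under assumed, illustrative names: retrying clears only the result, while the chosen mode and source image survive.

```ts
type ScanMode = "barcode" | "meter" | "text"; // illustrative mode names

interface DemoSession {
  mode: ScanMode;
  image: File | null;
  result: string | null;
}

// Rescan discards only the outcome; the mode and image are kept,
// so retrying costs one click instead of a full restart.
function rescan(session: DemoSession): DemoSession {
  return { ...session, result: null };
}

// Switching modes from the results view likewise preserves the image.
function switchMode(session: DemoSession, mode: ScanMode): DemoSession {
  return { ...session, mode, result: null };
}
```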

Strategy 5

Visual Consistency & Accessibility

Addresses: Product Felt Less Polished

Design Decisions

  • Consistent layouts across screen sizes

  • Reduced visual clutter to improve focus

  • WCAG-compliant contrast and typography (contrast check sketched after this list)

  • Used subtle animations to reinforce system feedback
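The contrast decisions can be checked against the standard WCAG formula: per-channel relative luminance, then the ratio (L1 + 0.05) / (L2 + 0.05). A sketch of that computation (illustrative, not Anyline's tooling):

```ts
// WCAG 2.x relative luminance for an sRGB color (channels 0–255).
function luminance(r: number, g: number, b: number): number {
  const lin = (c: number) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// Contrast ratio between foreground and background.
// WCAG AA normal text needs ≥ 4.5:1; AAA needs ≥ 7:1.
function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number]
): number {
  const l1 = luminance(...fg);
  const l2 = luminance(...bg);
  const [hi, lo] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (hi + 0.05) / (lo + 0.05);
}

// Example: black on white → contrastRatio([0, 0, 0], [255, 255, 255]) ≈ 21
```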

Strategy 5

04 The System in Action

Show time

These decisions collectively shaped the final experience—one that feels fast, clear, forgiving, and trustworthy during product evaluation.

The following high-fidelity prototypes demonstrate how these decisions come together in a cohesive, end-to-end scanning experience.

Web - Prototype

Mobile - Prototype

Usability Validation (SUS)

Post-Redesign Outcome
  • SUS improved from 52 → 74 (+22),
    moving the experience from below-average to good usability.

  • Confirmed that the redesigned demo felt clearer, faster, and easier to use during evaluation.

Handoff & Collaboration

  • Partnered closely with engineering to ensure smooth implementation.

  • Design intent, motion behavior, and accessibility requirements were preserved through structured handoff and regular cross-functional reviews.

05 Outcomes & Learnings

Outcomes

The redesigned WebSDK demo delivered measurable improvements across usability, efficiency, and conversion:

  • 40% fewer clicks per scan, driven by a simplified 3-step flow.

  • ~15s faster Time-to-First-Scan, with even greater gains during repeat scans.

  • 26% drop-off reduction.

  • SUS improved from 52 → 74.

  • Lead conversion increased from 6% → 10%.

  • WCAG 2.1 AAA compliant, ensuring accessibility across devices and users.

  • Delivered on time with 100% client satisfaction.

Key Learnings

  • Clarity matters as much as speed
    Simplifying the flow reduced effort, while clear feedback and recovery reduced hesitation.

  • Accessibility isn’t optional
    Designing for WCAG compliance improved clarity and usability for all users, not just edge cases.

  • Prototypes accelerate buy-in
    Letting stakeholders experience the flow firsthand reduced ambiguity and sped up approvals.

  • Design efficiency = business efficiency
    Simplifying workflows reduced user effort while directly improving engagement and conversion.

“The redesigned demo has completely elevated how we present our technology. It’s faster, smarter, and feels smooth.”

— Lucie M., Project Manager, Anyline
