
The SaaS UX Audit Checklist - 100 Points to Check Today

The same 100-point UX audit checklist our team uses when reviewing SaaS products across onboarding, navigation, dashboards, forms, accessibility, mobile responsiveness, and performance.

Interactive checklist | Free, no email required | Updated May 2026 | 100 checklist prompts

How to Use This Checklist

This is the exact 100-point checklist our UX team uses when auditing SaaS products. Walk through each item against your product, score it (Pass / Partial / Fail), and note the specific issue. By the end, you'll have a prioritized list of improvements.

Scoring:

  • Pass — Meets the standard with no issues
  • Partial — Somewhat meets the standard but has room for improvement
  • Fail — Does not meet the standard; needs attention

Priority Levels:

  • Critical — Directly impacts revenue, user safety, or legal compliance. Fix immediately.
  • Major — Significantly impacts usability or conversion. Fix within 30 days.
  • Minor — Impacts polish or delight. Fix within 90 days.

Severity Reference (for scoring findings):

  • 0 = Not a problem
  • 1 = Cosmetic issue — fix if time permits
  • 2 = Minor usability issue — low priority fix
  • 3 = Major usability issue — important to fix, high priority
  • 4 = Usability catastrophe — must fix before launch/next release

How to Score Your Audit

After completing all 100 checkpoints:

Calculate your overall score:

  • Count total Passes, Partials, and Fails
  • Score: Pass = 1 point, Partial = 0.5 points, Fail = 0 points
  • Maximum score: 100 points

Interpret your score:

  • 90-100: Excellent UX — minor polish needed
  • 75-89: Good UX — several meaningful improvements available
  • 60-74: Adequate UX — significant issues impacting user experience
  • 40-59: Below average — major usability problems likely impacting business metrics
  • Below 40: Critical — UX is actively driving churn and support costs
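The scoring and interpretation rules above can be captured in a small helper. This is a minimal sketch; the function names are ours, not part of the checklist:

```python
def audit_score(passes: int, partials: int, fails: int) -> float:
    """Pass = 1 point, Partial = 0.5 points, Fail = 0; max 100 across 100 checkpoints."""
    assert passes + partials + fails == 100, "score all 100 checkpoints first"
    return passes + 0.5 * partials

def interpret(score: float) -> str:
    """Map a total score to the interpretation bands above."""
    if score >= 90:
        return "Excellent UX"
    if score >= 75:
        return "Good UX"
    if score >= 60:
        return "Adequate UX"
    if score >= 40:
        return "Below average"
    return "Critical"
```

For example, 80 Passes, 10 Partials, and 10 Fails score 85 points, landing in the "Good UX" band.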

Prioritize your fixes:

  1. All Critical-priority Fails → fix immediately (these impact revenue or compliance)
  2. All Major-priority Fails → fix within 30 days
  3. All Critical/Major Partials → improve within 60 days
  4. All Minor-priority Fails and remaining Partials → fix within 90 days

Track improvement:

  • Re-run this audit monthly (or after major releases)
  • Track your score over time to demonstrate UX improvement to stakeholders
  • Pair this audit with real user analytics (session recordings, heatmaps, user interviews) for maximum insight

Run The Interactive Audit

Score each checkpoint as Pass, Partial, or Fail and capture notes while you review the product.

CATEGORY 1: ONBOARDING & ACTIVATION (20 Points)

1.1 — Signup Flow

1. Signup form asks only for essential information.

The signup form collects only what's needed to create an account (typically: email, password, and optionally name). Business-qualifying questions (company size, role, use case) are deferred to a secondary step or asked in-product — not during signup. Every additional field measurably reduces signup completion; conversion studies commonly cite drops of roughly 10% per field.

2. Social login / SSO options are available.

Google, Microsoft, GitHub, or Apple sign-in reduces friction for users who don't want to create new credentials. For B2B SaaS, "Sign in with Google Workspace" or "Sign in with Microsoft" are most relevant.

3. Password requirements are clearly communicated before submission.

If you require a minimum length, special characters, or mixed case, these requirements are displayed before the user submits — not as an error message after failed submission. Real-time validation (green checkmarks as requirements are met) is ideal.
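The real-time checkmarks described above can be driven by one function that reports which requirements the current input meets. A sketch only; the requirement set below is illustrative, not a recommended policy:

```python
import re

def password_checks(pw: str, min_len: int = 8) -> dict[str, bool]:
    """Which requirements are currently met; render each True as a green checkmark."""
    return {
        f"At least {min_len} characters": len(pw) >= min_len,
        "Contains a number": bool(re.search(r"\d", pw)),
        "Contains upper and lower case": any(c.isupper() for c in pw)
                                         and any(c.islower() for c in pw),
    }
```

Re-run the function on every keystroke and update the checklist next to the field, so users never meet the rules for the first time in an error message.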

4. Email verification flow is fast and clear.

If you require email verification, the verification email arrives within 30 seconds, the CTA in the email is prominent, and the user is redirected to a logged-in state immediately upon clicking. There is a clear "Resend verification email" option visible on the waiting screen.

5. Signup-to-first-screen transition is immediate.

After signup, the user sees a meaningful product screen within 3 seconds — not a loading spinner, not a blank dashboard, not a "your account is being set up" message that takes 30 seconds. First impressions are formed in milliseconds.

1.2 — Onboarding Flow

6. The product communicates its core value within the first 60 seconds.

The user can understand what this product does and why it matters to them within their first minute of interaction. This is often achieved through a welcome screen with a clear value statement, a guided first action, or a pre-populated demo environment.

7. The onboarding guides users toward their "aha moment."

The onboarding flow is designed to get users to the specific action that correlates with long-term retention (the activation event). For a project management tool, this might be "create your first project and invite a teammate." For an analytics tool, "connect your first data source and see your first dashboard."

8. Empty states are designed, not just tolerated.

When a user lands on a page with no data (empty dashboard, empty project list, no messages), the empty state provides clear guidance: what this section is for, why it's empty, and a CTA to take the first action. "No data available" is never acceptable.

9. Onboarding progress is visible.

If onboarding consists of multiple steps, a progress indicator (checklist, progress bar, step counter) shows users where they are, how many steps remain, and what they've already completed. Completionist psychology drives higher completion rates.

10. Users can skip or defer onboarding.

Power users and returning users should be able to skip the onboarding flow entirely. A "Skip for now" or "I'll do this later" option should be available at every step. Onboarding should guide, not trap.

11. Onboarding adapts to user role or use case.

If your product serves multiple personas (e.g., admin vs. team member, marketer vs. developer), the onboarding asks a qualifying question early and adapts the flow accordingly. A developer doesn't need the same onboarding as a marketing manager.

12. Sample/demo data is available for exploration.

For data-heavy products (dashboards, analytics, project tools), a pre-populated demo workspace lets users explore features without setting up their own data first. This dramatically reduces time-to-value for evaluation-phase users.

1.3 — First-Session Experience

13. The most important action on each screen is visually obvious.

On every screen, the primary action the user should take is the most prominent element — through size, color, position, or contrast. Users should never wonder "what am I supposed to do here?"

14. Help is accessible but not intrusive.

Tooltips, contextual help icons, or a help drawer are available for users who need guidance — but they don't block the interface or auto-play on every visit. Help should be opt-in, not forced.

15. The first session ends with clear next steps.

When a user finishes their first session (or closes the browser), they should know what to do next time they return. This might be a checklist of remaining setup steps, a "welcome back" email with suggestions, or an in-product banner on their next visit.

1.4 — Activation Metrics

16. The product tracks activation events.

There is a defined activation metric (e.g., "user invited a teammate," "user created first report") and analytics are configured to measure it. You cannot improve what you don't measure.

17. Time-to-value is measured and optimized.

The time from signup to first meaningful value delivery is tracked. Industry benchmark: under 5 minutes for simple products, under 30 minutes for complex products. Every minute of delay loses users.

18. Drop-off points in onboarding are identified.

Analytics show exactly where users abandon the onboarding flow. Common drop-off points include: data import steps, integration setup, team invite, and payment wall. Each drop-off point has a hypothesis for why and a plan to improve.

19. Re-engagement for incomplete onboarding exists.

Users who started onboarding but didn't complete it receive a follow-up (email, in-product prompt, or push notification) within 24 hours with a direct link to resume where they left off.

20. The trial experience provides enough access to demonstrate value.

Trial limitations (feature restrictions, time limits, usage caps) don't prevent users from experiencing the product's core value. If your trial is so restricted that users can't reach the "aha moment," your trial is hurting conversion, not helping it.

CATEGORY 2: NAVIGATION & INFORMATION ARCHITECTURE (15 Points)

21. Primary navigation is visible and consistent across all pages.

The main navigation menu is in the same position on every page, uses the same labels, and is always accessible (not hidden behind a hamburger menu on desktop). Users should never lose their sense of "where am I?"

22. Navigation labels use user language, not internal jargon.

Menu items are labeled with words users would use, not internal product team terminology. "Reports" not "Analytics Engine." "Team" not "User Management." "Settings" not "Configuration." Card sorting or tree testing can validate label choices.

23. The current location is clearly indicated.

The active navigation item is visually highlighted (color, underline, bold, background). Breadcrumbs show the hierarchical path. The page title matches the navigation label. Users always know where they are.

24. Navigation depth is 3 levels or fewer.

Users can reach any feature or page within 3 clicks or taps from the main dashboard. Deep navigation hierarchies (4+ levels) cause disorientation and frustration. If navigation is deeper than 3 levels, the information architecture needs restructuring.
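One quick way to audit this is to walk your navigation tree and compute its depth. A sketch, assuming a simple nested-dict sitemap (the shape is our assumption):

```python
def nav_depth(node: dict) -> int:
    """Depth of a navigation tree; a leaf item counts as one level."""
    children = node.get("children", [])
    return 1 + max((nav_depth(c) for c in children), default=0)

# Illustrative sitemap: Home -> Projects -> Tasks is 3 levels deep.
nav = {"label": "Home", "children": [
    {"label": "Projects", "children": [{"label": "Tasks"}]},
]}
```

If `nav_depth` returns more than 3 for any branch, that branch is a candidate for restructuring.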

25. Search is available and functional.

A global search feature is accessible from every page (typically in the header). Search returns relevant results quickly, handles typos and partial matches, and covers all major content types (pages, settings, records, help articles).

26. Frequently used actions are easily accessible.

The 5-10 most common user actions (create new item, view dashboard, access settings, get help) are reachable within 1-2 clicks from any screen. Keyboard shortcuts are available for power users.

27. Back navigation works as expected.

The browser back button returns users to the previous screen without data loss, broken states, or unexpected behavior. "Back" should always mean "undo my last navigation," never "start over" or "lose my work."

28. Mobile navigation is usable.

On mobile devices, the navigation is accessible (hamburger menu, bottom tabs, or similar pattern), touch targets are at least 44×44px (per WCAG 2.5.5, Level AAA), and the most important actions are reachable without extensive scrolling.

29. Secondary/utility navigation is separated from primary navigation.

Settings, help, account, billing, and profile links are in a secondary location (top-right dropdown, sidebar footer, or gear icon) — not mixed into the primary product navigation.

30. Contextual navigation exists within features.

Complex features with multiple sub-views (e.g., a project with tasks, files, discussions, settings) have local navigation (tabs, sub-navigation) within the feature — not requiring users to go back to the main menu and re-enter.

31. Navigation works with keyboard only.

All navigation items can be accessed and activated using only the keyboard (Tab, Enter, Escape, Arrow keys). Focus order follows visual order. Focus indicators are clearly visible. This is both an accessibility requirement (WCAG 2.1.1) and a power-user feature.

32. Empty navigation states are handled.

If a navigation category has no content (e.g., "Reports" with no reports created), clicking it shows an empty state with guidance — not a blank page or an error.

33. Navigation doesn't shift or change unexpectedly.

Navigation items don't reorder, appear, or disappear based on context in a way that confuses users. If items are permission-dependent (admin vs. user), the structure remains consistent — items are either always visible or always hidden, not dynamically shuffling.

34. External links are clearly distinguished.

Links that open in a new tab or navigate away from the product are marked with an icon (external link icon) or visual indicator so users know they're leaving the current context.

35. Error pages (404, 500) provide navigation back to safety.

Custom error pages include: a clear explanation of what happened, a link to the homepage or dashboard, a search bar, and a way to contact support. The default server error page is never shown.

CATEGORY 3: DASHBOARD & DATA DISPLAY (15 Points)

36. The dashboard surfaces the most important information first.

The dashboard's visual hierarchy prioritizes the metrics and data that matter most to the user's primary job. Not everything deserves equal visual weight. The most critical numbers should be the largest, highest-contrast, and most prominent elements.

37. Data has context.

Numbers are shown with comparison context: period-over-period (vs. last week/month), trend direction (up/down arrow), goal progress (80% of target), or benchmark (above/below average). A number without context is meaningless.

38. Charts and visualizations are appropriate for the data type.

Line charts for trends over time. Bar charts for comparisons between categories. Tables for detailed records. Metric cards for single KPIs. Pie charts are used sparingly (and only for parts-of-a-whole with 5 or fewer segments). The wrong chart type misrepresents data and confuses users.

39. Dashboards are scannable in under 10 seconds.

A user should be able to glance at the dashboard and understand the overall status (good, bad, needs attention) within 10 seconds. If they need to read labels, cross-reference numbers, or scroll extensively to understand status, the dashboard has failed its primary job.

40. Filters and date ranges are intuitive and persistent.

If the dashboard allows filtering (by date, team, project, status), the filter controls are easily discoverable, the active filters are clearly displayed, and filter selections persist across page navigations within the session.

41. Data loading states are informative.

While data loads, skeleton screens or shimmer placeholders show the structure of what's coming — not a blank page or a generic spinner. Users should know the page is loading content, not broken.

42. Empty data states provide guidance.

When a chart or table has no data (new account, filtered result with zero matches), the empty state explains why and what the user can do about it. "No data to display" is insufficient. "No data for this date range. Try expanding the date range or check your data source connection" is better.

43. Tables are sortable and searchable.

Data tables allow sorting by any column header (ascending/descending) and include a search or filter mechanism for finding specific records. Pagination or infinite scroll is implemented for large datasets (100+ rows).

44. Large numbers are human-readable.

Numbers are formatted with thousands separators (1,234,567 not 1234567), abbreviated when appropriate (1.2M not 1,234,567), and use appropriate precision (2 decimal places for currency, 0 for counts, 1 for percentages).
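These formatting rules are easy to centralize in one helper so every metric card and table cell behaves the same way. A sketch; the abbreviation thresholds are illustrative:

```python
def human_number(n: float) -> str:
    """Abbreviate large values (1.2M) and add thousands separators otherwise."""
    for threshold, suffix in ((1_000_000_000, "B"), (1_000_000, "M"), (10_000, "K")):
        if abs(n) >= threshold:
            return f"{n / threshold:.1f}{suffix}"
    return f"{n:,.0f}"
```

So `human_number(1234567)` yields "1.2M" and `human_number(1234)` yields "1,234".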

45. Color is not the only indicator of meaning.

If red means "bad" and green means "good," there must also be a text label, icon, or pattern that communicates the same meaning for color-blind users (approximately 8% of males, 0.5% of females). This is a WCAG 1.4.1 requirement.

46. Data can be exported.

Users can export dashboard data or table data in a usable format (CSV, Excel, PDF). Export functionality includes the current filter/date range selection. Export buttons are discoverable but not prominent enough to cause accidental clicks.

47. Real-time data indicates freshness.

If data updates in real-time, a "Last updated: 2 min ago" timestamp is visible. If data is delayed (batch-processed), the delay is communicated. Users should never wonder whether they're looking at current or stale data.
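A "Last updated" label like the one above can come from one small formatter fed the data's age. A sketch; the cut-offs are illustrative:

```python
def freshness_label(age_seconds: float) -> str:
    """Turn a data age in seconds into a human-readable 'Last updated' label."""
    if age_seconds < 60:
        return "Last updated: just now"
    if age_seconds < 3600:
        return f"Last updated: {int(age_seconds // 60)} min ago"
    if age_seconds < 86400:
        return f"Last updated: {int(age_seconds // 3600)} h ago"
    return f"Last updated: {int(age_seconds // 86400)} d ago"
```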

48. Drill-down is available for summary metrics.

Summary numbers on the dashboard are clickable, leading to a detailed view that shows the underlying records. A metric that says "42 new users this week" should link to a list of those 42 users. Summary without drill-down is a dead end.

49. Dashboard is responsive across devices.

The dashboard layout adapts to tablet and mobile screens. Charts resize, tables scroll horizontally, and metric cards stack vertically. The mobile dashboard may show fewer data points but should still communicate the overall status.

50. Dashboard customization is available (optional but ideal).

Users can rearrange, add, remove, or resize dashboard widgets to personalize their view. At minimum, a choice between 2-3 preset layouts (executive overview, operational detail, custom) adds significant value.

CATEGORY 4: FORMS & INPUT (12 Points)

51. Form labels are always visible.

Every input field has a visible label above or beside it — not just placeholder text inside the field. Placeholder text disappears when the user starts typing, leaving them without context. Labels should remain visible at all times (the floating label pattern is acceptable).

52. Required fields are clearly marked.

Required fields are indicated with an asterisk (*) and/or the word "Required." Optional fields may be marked as "Optional." The convention should be consistent throughout the product.

53. Inline validation provides immediate feedback.

Form fields validate in real-time (or on field blur) — not only on form submission. Email format, password strength, required field completion, and character limits are validated as the user types. Error messages appear immediately next to the relevant field.

54. Error messages are specific and actionable.

Error messages explain what went wrong and what the user should do to fix it. "Invalid input" is unacceptable. "Email address must include an @ symbol" or "Password must be at least 8 characters" are specific and actionable.

55. Success confirmation is provided after form submission.

When a form is submitted successfully, clear confirmation is shown (success message, green checkmark, toast notification, or redirect to a confirmation page). The user should never wonder "did it work?"

56. Long forms are broken into logical steps.

Forms with more than 6-8 fields are divided into steps or sections with a progress indicator. Each step groups related fields (e.g., Step 1: Account Info, Step 2: Company Info, Step 3: Preferences). Users can navigate back to previous steps without losing data.

57. Auto-fill and auto-complete work correctly.

Browser auto-fill (name, email, address, credit card) is supported through correct HTML input types and autocomplete attributes. Search fields offer auto-complete suggestions based on available options.

58. Destructive form actions require confirmation.

Actions that delete data, cancel subscriptions, or make irreversible changes require a confirmation step ("Are you sure you want to delete this project? This action cannot be undone.") with a clear explanation of consequences.

59. File upload fields show progress and accepted formats.

File upload fields clearly state accepted formats (e.g., "PDF, DOCX, or PNG, max 10MB") before upload, show upload progress (percentage or progress bar), and provide clear error messages for invalid files (wrong format, too large).

60. Form data is preserved on error.

If a form submission fails (validation error, server error, network error), all previously entered data is preserved. The user should never have to re-enter information they already typed.

61. Date and time inputs use appropriate components.

Date fields use date pickers (not free-text input), time fields use time pickers, and date ranges use range selectors. The format matches user locale expectations (MM/DD/YYYY for US, DD/MM/YYYY for most of the world).

62. Form field tab order is logical.

Pressing Tab moves focus through form fields in the visual order they appear. Focus doesn't jump to unexpected fields, skip fields, or get trapped in a field.

CATEGORY 5: FEEDBACK & ERROR HANDLING (10 Points)

63. System status is always visible.

The user can always tell what the system is doing: loading (spinner/skeleton), processing (progress bar), waiting for user input (cursor in field), or complete (confirmation). Nielsen's first heuristic: visibility of system status.

64. Success actions provide confirmation.

Every user action that changes system state (save, delete, send, create, update) provides feedback confirming the action was successful: toast notification, inline confirmation, page redirect with success message, or visual state change.

65. Error messages are human-readable.

No error codes, stack traces, or technical jargon in user-facing error messages. "Something went wrong. Please try again or contact support at hello@desisle.com" is better than "Error 500: Internal Server Error" or "Uncaught TypeError: Cannot read property 'map' of undefined."

66. Errors indicate the location of the problem.

For form errors, the specific field with the error is highlighted (red border, error icon) and the error message appears next to that field — not only at the top or bottom of the form. For system errors, the affected area of the UI is indicated.

67. Recovery from errors is easy.

When an error occurs, the user can recover without losing work. "Undo" is available for accidental deletions. "Retry" is available for failed network requests. "Back" returns to the previous valid state.

68. Destructive actions are reversible when possible.

Delete actions use a soft-delete pattern (moved to trash, recoverable for 30 days) rather than hard delete when feasible. If hard delete is necessary, a confirmation dialog explains the consequences explicitly.

69. Loading states are distinguished from errors.

Users can clearly distinguish between "still loading" (content is coming) and "nothing here" (no content exists) and "something broke" (an error occurred). These three states require different visual treatments.

70. Timeout and session expiry are handled gracefully.

If a session expires, the user receives a clear notification and is redirected to login — without losing unsaved work (autosave should preserve their state). The session timeout duration is communicated in advance if possible.

71. Offline states are handled (for applicable products).

For products used on mobile or in unreliable network conditions, offline states are communicated clearly, and offline functionality (if any) is explained. "You're offline. Changes will sync when you reconnect" is far better than a silent failure.

72. Rate limiting and system overload are communicated.

If the system is rate-limiting the user or experiencing high load, a user-friendly message explains the situation and sets expectations. "We're experiencing high traffic. Your request is queued and will be processed shortly" is better than a generic error.

CATEGORY 6: ACCESSIBILITY (12 Points)

73. Color contrast meets WCAG AA standards.

Text has a contrast ratio against its background of at least 4.5:1 for normal text and 3:1 for large text (at least 18pt/~24px, or 14pt/~18.5px bold). Test with the WebAIM Contrast Checker, the Stark plugin for Figma, or Chrome DevTools' accessibility audit.
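The 4.5:1 threshold comes from WCAG's relative-luminance formula, which is simple enough to script as a spot check on your palette:

```python
def _relative_luminance(rgb: tuple[int, int, int]) -> float:
    """WCAG relative luminance of an sRGB color given as 0-255 channels."""
    def channel(c: int) -> float:
        c /= 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    """WCAG contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05), range 1-21."""
    l1, l2 = sorted((_relative_luminance(fg), _relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)
```

Black on white gives the maximum ratio of 21:1; a mid-gray like #767676 on white sits just above the 4.5:1 AA threshold.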

74. Interactive elements are keyboard-accessible.

All buttons, links, form fields, dropdowns, modals, and interactive components can be reached and activated using only the keyboard (Tab to navigate, Enter/Space to activate, Escape to close). No functionality is mouse-only.

75. Focus indicators are visible.

When a user navigates with the keyboard, the currently focused element has a clearly visible focus ring (border, outline, or background change). Default browser focus rings should not be removed via CSS `outline: none` without providing a custom alternative.

76. Images have descriptive alt text.

All meaningful images have alt attributes that describe their content and purpose. Decorative images have empty alt attributes (alt=""). Charts and graphs have text descriptions or data table alternatives.

77. Headings follow a logical hierarchy.

The page has one H1, followed by H2s for main sections, H3s for subsections, and so on. Heading levels are not skipped (no H1 → H3 without an H2). Headings describe the content of their section. Screen readers use headings for navigation.
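A quick structural check — does the outline have one H1 and no skipped levels? — can be scripted over the page's heading levels. A sketch, assuming you've already extracted the levels in document order:

```python
def heading_issues(levels: list[int]) -> list[str]:
    """Flag outline problems given heading levels in document order, e.g. [1, 2, 3, 2]."""
    issues = []
    if levels.count(1) != 1:
        issues.append("page should have exactly one H1")
    for prev, cur in zip(levels, levels[1:]):
        if cur > prev + 1:  # e.g. an H3 directly after an H1
            issues.append(f"skipped level: H{prev} -> H{cur}")
    return issues
```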

78. Link text is descriptive.

Links say where they go, not just "click here" or "learn more." "View pricing details" is accessible; "click here" is not. Screen reader users often navigate by scanning a list of links — each link must make sense out of context.

79. Form error messages are programmatically associated with fields.

Error messages are connected to their respective form fields using aria-describedby or aria-errormessage attributes so screen readers announce the error when the field receives focus.

80. Modals and dialogs trap focus appropriately.

When a modal opens, keyboard focus moves inside the modal. Tab cycles through modal content only (not behind the modal). Escape closes the modal. When the modal closes, focus returns to the element that triggered it.

81. ARIA roles and labels are used correctly.

Custom interactive components (tabs, accordions, dropdowns, sliders) use appropriate ARIA roles (role="tablist", role="dialog") and labels (aria-label, aria-labelledby) so screen readers can interpret them.

82. Text can be resized to 200% without loss of content or functionality.

When the user zooms to 200% using browser zoom, all content remains readable, no text is clipped or hidden, and all functionality remains accessible (WCAG 1.4.4 Resize Text). Content reflows rather than requiring horizontal scrolling (WCAG 1.4.10 Reflow, which is tested at 400% zoom / a 320px-wide viewport).

83. Motion and animation can be disabled.

Animations, transitions, and auto-playing content respect the user's prefers-reduced-motion operating system setting. Users who experience motion sickness or vestibular disorders should not be subjected to excessive animation.

84. Touch targets are at least 44×44 CSS pixels.

On mobile and touch devices, all interactive elements (buttons, links, form controls) have a minimum tap target size of 44×44px to prevent misclicks, especially for users with motor impairments. WCAG 2.5.5 Target Size (Level AAA) recommends 44×44px; WCAG 2.2's 2.5.8 Target Size (Minimum) requires 24×24px at Level AA.

CATEGORY 7: MOBILE RESPONSIVENESS (8 Points)

85. The product is usable on a 375px-wide screen.

All primary functionality works on a screen width of 375px (iPhone SE / standard mobile). Content reflows, navigation is accessible, and no horizontal scrolling is required for standard content.

86. Touch targets are appropriately sized and spaced.

Buttons, links, and interactive elements have at least 44×44px tap targets with at least 8px spacing between adjacent targets. Crowded interfaces cause misclicks and frustration.

87. Text is readable without zooming.

Body text is at least 16px on mobile. Headings are proportionally larger. Line height is at least 1.5x font size. Users should not need to pinch-zoom to read standard content.

88. Tables and data-heavy views adapt to mobile.

Large data tables either scroll horizontally with a fixed first column, use card-based layouts on mobile, or offer a simplified mobile view. Tables should not break the page layout or require zooming out.

89. Forms are optimized for mobile input.

Input types trigger appropriate keyboards (type="email" for email, type="tel" for phone, type="number" for numbers). Labels are above fields (not beside). Auto-correct and auto-capitalize are set appropriately.

90. Navigation is accessible on mobile.

The primary navigation is accessible via a hamburger menu, bottom tab bar, or similar mobile-native pattern. The current page is clearly indicated. Navigation doesn't require complex gestures.

91. Modals and overlays work on mobile.

Modals are full-screen or near-full-screen on mobile (not tiny centered boxes), can be scrolled if content is long, and can be dismissed easily (close button, tap outside, or swipe down).

92. Performance is acceptable on mobile networks.

Pages load within 3 seconds on a 4G connection. Images are optimized (WebP format, lazy-loaded, appropriately sized). JavaScript bundles are minimized. Lighthouse Performance score is 80+ on mobile.

CATEGORY 8: PERFORMANCE & TECHNICAL UX (8 Points)

93. Page load time is under 3 seconds.

Measured by Largest Contentful Paint (LCP) on Core Web Vitals. Ideal: under 2.5 seconds. Acceptable: under 3 seconds. Above 3 seconds: users start abandoning. Test with Google PageSpeed Insights or Chrome Lighthouse.

94. Interactions are responsive (under 200ms).

Clicks, taps, and keyboard inputs produce visual feedback within 200ms. Measured by Interaction to Next Paint (INP) on Core Web Vitals. Ideal: under 200ms. Sluggish interactions make products feel "broken" even when they work correctly.

95. Visual stability is maintained (no layout shift).

Page elements don't jump or shift as content loads. Measured by Cumulative Layout Shift (CLS) on Core Web Vitals. Ideal: under 0.1. Common causes: images without width/height attributes, late-loading ads/banners, dynamic content insertion above the viewport.
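The three Core Web Vitals thresholds in checkpoints 93-95 fit in one lookup. The "good" cut-offs come from the text above; the "poor" boundaries are Google's published values (LCP 4s, INP 500ms, CLS 0.25), not stated in this checklist:

```python
# metric -> (good_at_or_below, poor_above); units noted per metric.
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "INP": (200, 500),    # milliseconds
    "CLS": (0.1, 0.25),   # unitless layout-shift score
}

def rate(metric: str, value: float) -> str:
    """Classify a Core Web Vitals measurement: good / needs improvement / poor."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"
```

This mirrors the three-band rating PageSpeed Insights reports for field data.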

96. The product works across major browsers.

Tested and functional on Chrome, Firefox, Safari, and Edge (latest versions). Critical features also work on mobile Safari (iOS) and Chrome (Android). Cross-browser differences are a perennial source of "it works on my machine" bugs.

97. Links and buttons actually work.

There are no broken links (404 errors), dead buttons (clicks that do nothing), or incomplete features (buttons that say "Coming Soon" without explanation). Every visible element does what it appears to do. Test with tools like Screaming Frog, Broken Link Checker, or manual QA.

98. The product handles long text and edge cases.

User-generated content (names, titles, descriptions) that exceeds expected lengths is handled gracefully: truncated with ellipsis, wrapped properly, or displayed in expandable containers. Content doesn't overflow containers, break layouts, or overlap other elements.

99. Favicons and meta tags are properly configured.

The browser tab shows a recognizable favicon and a meaningful page title (not "Untitled" or "Dashboard | undefined"). Open Graph and Twitter Card meta tags produce good previews when links are shared on social media or in Slack.

100. The product maintains state across sessions.

When a user closes the browser and returns, they should see the same view they left (same page, same filters, same scroll position where feasible). Session state should persist through page refreshes. "Starting over every time" is a major friction source.

What This Checklist Doesn't Cover

This checklist covers heuristic usability evaluation — issues identifiable by expert review. It does NOT replace:

  • User testing: Observing real users interact with your product reveals problems no checklist can find
  • Analytics review: Quantitative data (drop-off rates, time-on-task, feature adoption) shows WHERE users struggle
  • Accessibility audit: This checklist includes 12 accessibility points but a full WCAG 2.1 AA audit requires 50+ criteria (see Resource 16: Web Accessibility Checklist)
  • Content audit: This checklist doesn't evaluate the quality of your copy, help documentation, or error message clarity in detail
  • Performance audit: This checklist includes 8 performance points but a full technical audit covers far more (server configuration, CDN, caching, code optimization)

For a professional UX audit that combines all of the above, contact Desisle at hello@desisle.com.

Sources and Standards Referenced:

  • Jakob Nielsen's 10 Usability Heuristics (Nielsen Norman Group, 1994, updated 2024)
  • WCAG 2.1 Level AA Success Criteria (W3C, 2018)
  • Google Core Web Vitals thresholds (Google, 2024)
  • Material Design Guidelines (Google, 2024)
  • Apple Human Interface Guidelines (Apple, 2024)
  • Baymard Institute Checkout Usability Research
  • NNGroup Research on onboarding patterns, form design, and dashboard usability

Created by Desisle — SaaS UI/UX Design Agency. desisle.com | hello@desisle.com. Free to use and share with attribution.

Keep Building With These Next

Playbook

SaaS Onboarding UX Playbook - The First 60 Seconds That Matter Most

A longer playbook for designing onboarding that activates users instead of only welcoming them. It covers first-time psychology, aha moments, progressive onboarding, empty states, anti-patterns, and measurement.

Open Onboarding Playbook
Worksheet

UX Heuristic Evaluation Template - Nielsen's 10 Heuristics Applied to SaaS

A ready-to-use spreadsheet for heuristic evaluation with Nielsen's heuristics, severity levels, a findings log, and a priority matrix built in.

Open Heuristic Worksheet
Template

User Journey Mapping Template - From First Touch to Renewal

A visual template for mapping discovery, signup, onboarding, support, renewal, and referral moments with emotion tracking and friction spotting.

Open Journey Map

Need This Applied to Your Product? We'll Turn It Into Execution.

These resource pages are meant to be used hands-on. If you want the audit, plan, or framework translated into live product work, we can do that with your team.