Dogfooding

[Illustration: a figure in an apron eating from a bowl that contains a laptop, the dogfooding metaphor]

In Brief

Dogfooding is an unstructured product research method where team members use their own product in real daily work, just as a customer would. You go through the actual workflows, note where things break or feel awkward, and record ideas for improvement. The output is a firsthand list of bugs, friction points, and unexpected use cases that formal testing often misses.

Common Use Case

Your team has built the first version of your product and you want a quick reality check before putting it in front of customers. You and your teammates use the product in your daily work for a week, noting every moment of friction, confusion, or delight so you can fix the biggest issues before anyone else sees it.

Helps Answer

  • Does the product actually deliver on the value proposition?
  • Is the product working correctly in real use?
  • What is the minimum viable feature set?
  • Where does the workflow break down or feel awkward?
Time

Ongoing. Dogfooding works best as a continuous habit rather than a one-time exercise. The initial structured round typically takes one to two weeks.

Cost

Dogfooding requires no additional budget beyond the team’s time; the only real cost is the opportunity cost of team members using the product instead of doing other work. AI note-taking tools can help capture observations without interrupting the workflow.

Description

Using one’s own product is standard practice among technology startups and entrepreneurs. Entrepreneurs often build a product to solve their own pain points, so using it themselves in daily work follows naturally.

There is no predefined script for dogfooding, and it is not a formal quality assurance (QA) process. The main advantage of this method is that it can reveal unorthodox use cases that were not covered in the requirements or QA tests. Dogfooding should be treated primarily as generative research, not as an experiment.

AI has minimal impact on dogfooding, and that is precisely the point. You cannot delegate this to an AI agent or automated testing script — the value comes from personally encountering friction, confusion, or delight while trying to accomplish a real task. If your product includes AI features, dogfooding becomes even more critical: you need firsthand experience of how the AI behaves in real scenarios, where it fails, and where it produces unexpected results. Automated QA and AI-powered testing tools are valuable for catching bugs, but they are not dogfooding.

Companies that do not use their own products and services are sometimes criticized, in some cases very publicly.

How to

Prep

  1. Set a clear scope. Pick the workflows and personas you intend to walk through (new-user signup, daily power-user task, admin path) so the team is exercising the same surfaces and you can compare notes afterward.
  2. Set up a low-friction capture channel. Make sure you have a place to take notes that is easily accessible and won’t interrupt your workflow too much — a shared issue tracker, a single Slack channel, or a running doc.
  3. Decide who participates. Pull in teammates outside the build team where possible; engineers and designers carry too much prior knowledge to notice every friction point alone.
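The method prescribes no particular tooling for the capture channel. As one illustration, here is a minimal sketch of a low-friction friction log, assuming a hypothetical shared JSONL file (`friction_log.jsonl`) and an invented note schema; a shared issue tracker or Slack channel would serve the same purpose.

```python
import json
import time
from dataclasses import dataclass, asdict
from pathlib import Path

# Hypothetical shared log file; a tracker or chat channel works equally well.
LOG_PATH = Path("friction_log.jsonl")

@dataclass
class FrictionNote:
    workflow: str      # which scoped workflow you were exercising
    expected: str      # what you expected to happen
    observed: str      # what actually happened
    severity: str      # e.g. "delight", "friction", "blocker"
    timestamp: float = 0.0

def log_note(note: FrictionNote, path: Path = LOG_PATH) -> None:
    """Append one observation without interrupting the workflow."""
    note.timestamp = note.timestamp or time.time()
    with path.open("a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(note)) + "\n")

log_note(FrictionNote(
    workflow="new-user signup",
    expected="verification email within a minute",
    observed="email took ten minutes; no progress indicator",
    severity="friction",
))
```

The point of the structured fields is that notes from different teammates stay comparable when you sort them later in the Analysis step.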

Execution

  1. Use the product in your day-to-day work. Resist the urge to switch to a workaround when something annoys you — the workaround is the data point.
  2. Take notes whenever something works surprisingly well or fails to live up to expectations. Record any additional insights or ideas that occur while using the product.
  3. Note any time when the workflow is interrupted or another service is needed to finish the task. These hand-offs are the most common hidden friction.
  4. Capture context, not just bugs. Take a screenshot, note what you were trying to do, and write down what you expected to happen.

Analysis

  1. Be careful to interpret the results as generative, not evaluative. The makers of a product have intimate knowledge of its design and are unlikely to capture a novice user’s perspective. This is especially true when dogfooding the new-user flow, where the team carries extensive prior knowledge of and expectations about signup and onboarding.
  2. Watch for missing edge cases, particularly when the team is not diverse. A homogeneous team may never evaluate the product from the perspective of users with disabilities or users from underrepresented groups, and thus overlook substantial aspects of the user experience. This becomes a bigger problem as a product scales beyond its initial niche audience.
  3. When multiple team members dogfood the product, collect and sort their notes via card sorting, stack ranking, or other standard UX methods to surface patterns rather than reacting to individual gripes.
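A lightweight stand-in for manual stack ranking is to tag each note with a theme and rank themes by how many independent notes mention them. This sketch assumes hypothetical tags and note text; the counts, not the wording, are what drive prioritization.

```python
from collections import Counter

# Hypothetical tagged notes collected from several team members.
notes = [
    {"tag": "onboarding", "text": "signup form rejected my password silently"},
    {"tag": "exports", "text": "CSV export timed out on a large project"},
    {"tag": "onboarding", "text": "no confirmation after inviting a teammate"},
    {"tag": "onboarding", "text": "tooltip covered the Next button"},
    {"tag": "exports", "text": "export emailed a broken link"},
    {"tag": "search", "text": "search ignored quoted phrases"},
]

# Rank themes by mention count, so the team reacts to patterns,
# not to whoever complained loudest.
ranking = Counter(note["tag"] for note in notes).most_common()
for tag, count in ranking:
    print(f"{tag}: {count} notes")
```

Here `onboarding` would rank first with three independent mentions; a theme raised by several teammates is a stronger signal than a single gripe.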

Biases & Tips

  • Confirmation bias: Creators of a product can subconsciously avoid situations and use cases they know are incomplete or buggy, leaving a false impression that the product works to specification even when it has serious flaws in ordinary usage.

  • Expert blind spot: Team members know the product too well to notice friction a first-time user would hit immediately. Recruit teammates outside the build team or pair internal testing with external usability testing.

  • Dogfooding only works when your team is as diverse as your customer base. - @TriKro

Next Steps

  • Triage dogfooding feedback into a prioritized issue backlog.
  • Supplement internal testing with external Usability Testing to cover blind spots.
  • Establish a regular dogfooding cadence (e.g., weekly, sprint-based).
  • Track the percentage of bugs found internally vs. reported by customers as a quality metric.
  • Run a Net Promoter Score Survey to measure whether real customers share the satisfaction your internal team experiences.
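The internal-vs-customer bug metric above is a simple ratio. A minimal sketch, with hypothetical sprint numbers, showing the arithmetic:

```python
def internal_catch_rate(found_internally: int, reported_by_customers: int) -> float:
    """Share of all known bugs that dogfooding caught before customers did."""
    total = found_internally + reported_by_customers
    if total == 0:
        return 0.0
    return found_internally / total

# Hypothetical sprint: 18 bugs caught by the team, 6 reported by customers.
rate = internal_catch_rate(18, 6)
print(f"Internal catch rate: {rate:.0%}")  # → Internal catch rate: 75%
```

Tracked over successive dogfooding rounds, a rising catch rate suggests internal use is finding problems before customers do.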

Learn more

Case Studies

Alphabet exec Eric Schmidt uses an iPhone, but thinks the Galaxy S7 is better
Cnet

Google’s Eric Schmidt: Why I Love My BlackBerry

Read more

Dogfooding Examples

Who’s Doing It Right?

Read more

Twitter

10 Massively Successful Minimum Viable Products

Read more

PostHog

How We Do Dogfooding (with Examples): PostHog’s 2024 deep dive into how they dogfood their own analytics platform. Key example: their data warehouse feature was inspired by internal teams needing PostHog to be the source of truth for product and customer data. They also built their in-app survey feature after discovering internal user interview booking was too cumbersome.

Read more

Bubble

Building Their Website on Their Own Platform: Bubble built their website using their own no-code platform, positioning their internal developers to test, iterate, and request new features. This created a “forcing function” that ensured all necessary features existed, with internal developers acting as demanding customers.

Read more

PostHog

The Importance of Dogfooding — Why Product Managers Should Use Their Product: PostHog documents how dogfooding works best when it is intentional and structured rather than ad hoc. Their approach treats internal engineers like customers, gathering ongoing feedback, running interviews, and iterating on usability and performance — similar to how Slack’s infrastructure team used their own developer tools daily.

Read more

Microsoft

Over 200 developers dogfooded daily builds of Windows; when someone’s code broke the build, they felt immediate consequences in their own workflow. The phrase traces to a 1988 internal email from Paul Maritz to Brian Valentine titled “Eating our own Dogfood,” which urged the LAN Manager team to use its own product to push adoption. The practice scaled to a 20,000+ node international network.

Read more

Lyft

Requires corporate employees to periodically work as drivers. This forces leadership to experience the product from the supply side, surfacing driver-facing UX issues that might never reach internal dashboards.

Read more

CrowdStrike

After the July 2024 worldwide outage, SVP Adam Meyers testified before the US Congress that increased internal dogfooding of agent updates was a formal remediation step the company committed to — a rare case of dogfooding adopted as regulatory remediation.

Read more

Frontegg

Uses its own Entitlements Engine to run subscription tiers and permission enforcement on its own product, so every internal subscription change validates the customer-facing engine.

Read more
