Product Hunt reviews are one of the most under-used intent signals in B2B SaaS. Every review is left by someone who actually used a product, voluntarily wrote about it, and — this is the part that matters — attached their real job title and company to the post. That's a lot more than you get from G2's anonymized profiles.

Below: how to pull every review on any PH product page into Excel, CSV, or JSON, then turn the file into a competitor teardown, a decision-maker outreach list, or a public trust badge for your own product.

Why export Product Hunt reviews

Reading reviews one at a time in the browser stops working past about a dozen entries. Push the data into a spreadsheet and you can finally answer the questions that matter on a Monday morning: who is recommending what, what role do they have, and which complaints come up over and over? The exporter is built for these jobs:

  • Direct competitor benchmarking — pull reviews for three or four products in your category and compare average rating, recommend rate, and complaint themes side-by-side.
  • Decision-maker outreach — filter the reviewer headline / role column for Founder, CEO, VP, Head of… and you've instantly got a list of named buyers who already evaluate tools in your space.
  • Feature-request mining — negative and four-star reviews are where the missing-feature signal lives. Aggregate them and your roadmap practically writes itself.
  • Negative-review triage — if it's your product, surface every 1–2 star review with full reviewer context so support can reach out within an hour, not a quarter.
  • Trust-badge generation — export your five-star reviews, drop the best ones into a testimonial wall on your landing page, complete with the reviewer's headline and company.
  • Sales-deck proof — a slide titled "What 87 customers said on Product Hunt" with average rating, recommend rate, and named-buyer quotes lands very differently in an enterprise pitch than a generic "users love us."

How to export — step by step

Step 1 — Copy the Product Hunt product URL

Open the product page you want reviews from — not the launch post, the canonical product page — and copy the URL. It looks like https://www.producthunt.com/products/your-product. The reviews tab on that product is what gets exported. (Easy mistake: Notion has a launch post AND a product page. Reviews live on the product page.)

Step 2 — Paste the URL into the exporter

Go to exportcomments.com/export-producthunt-reviews, paste the URL, and pick your format. Excel (XLSX) is the default and is good for filtering, pivoting, and pasting into ChatGPT; CSV is best for CRMs and data warehouses; JSON is for scripting and automation.

Step 3 — Run the export

Click Export. Most jobs finish in 30–90 seconds depending on review volume. Close the tab if you want; the email and your dashboard My Exports view will both show the result when it's ready.

Step 4 — Download and open the spreadsheet

Each review is one row. Sort by star_rating ascending to triage negative reviews first. Or sort by helpful_votes descending to surface the ones other PH users found most useful — those tend to carry the highest perceived signal.
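
Both triage sorts can be combined into one pass. A minimal Python sketch, assuming the export's column names (`star_rating`, `helpful_votes`) and a made-up three-row sample standing in for the downloaded file:

```python
import csv
import io

# Hypothetical sample mimicking the export's columns and format.
sample = """review_id,star_rating,helpful_votes,review_text
r1,5,12,Love it
r2,2,30,Import keeps failing
r3,4,3,Solid but missing SSO
"""

rows = list(csv.DictReader(io.StringIO(sample)))

# Triage order: lowest star rating first; within a rating,
# most-helpful reviews first (they carry the most perceived signal).
triage = sorted(rows, key=lambda r: (int(r["star_rating"]), -int(r["helpful_votes"])))
```

In a real run you would point `csv.DictReader` at the downloaded CSV instead of the inline sample.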

Step 5 (optional) — Bulk-export competitor product pages

To benchmark against multiple competitors, paste several product URLs into the bulk-upload box, one per line. ExportComments runs each one in parallel and gives you back one file per URL packaged in a single ZIP. Each competitor stays in its own clean spreadsheet, which is exactly what you want for a side-by-side comparison.

Inside the export — what fields you get

Every row gives you the fields you need for analysis, segmentation, and outreach:

  • review_id — unique Product Hunt identifier for the review.
  • reviewer_handle — @username on Product Hunt.
  • reviewer_name — display name shown on the profile.
  • headline — the reviewer's self-written one-liner (e.g. "Head of Growth at Acme").
  • company — parsed company affiliation from headline / profile.
  • role — parsed role / job title (Founder, CEO, VP, PM, Engineer…).
  • star_rating — integer 1–5 stars.
  • recommended — Y/N flag, separate from the star rating (see note below).
  • review_text — full body of the review with line breaks preserved.
  • helpful_votes — number of "helpful" votes the review received from other PH users.
  • created_at — ISO-8601 timestamp of when the review was posted.
  • profile_url — direct link to the reviewer's Product Hunt profile.

Why both star rating and a separate "recommended" flag matter. Product Hunt collects both signals on purpose. The star rating measures perceived product quality on a granular scale. The recommended flag captures the binary "would I tell a friend to use this?" question. The two diverge more often than you'd think — a reviewer might give a power tool four stars and still recommend it, or give a polished consumer app five stars while flagging it as not recommended for their specific use case. Always analyse the two columns together. The average of one without the rate of the other tells you half the story.
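
Computing the two metrics side by side is a few lines of Python. A sketch assuming the export's `star_rating` and `recommended` columns, with invented sample values:

```python
import csv
import io

# Hypothetical sample: star rating and recommend flag diverge on purpose.
sample = """star_rating,recommended
5,Y
4,Y
5,N
2,N
"""

rows = list(csv.DictReader(io.StringIO(sample)))

# Report both numbers together — either one alone is half the story.
avg_rating = sum(int(r["star_rating"]) for r in rows) / len(rows)
recommend_rate = sum(r["recommended"] == "Y" for r in rows) / len(rows)
```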

Common workflows for SaaS founders

Competitor benchmarking matrix. Bulk-export the review pages of four products in your category. In a fifth tab, pivot each file by star_rating and compute the recommend rate (count of recommended = Y / total). What you get is a four-column matrix that tells you exactly where you sit on quality and on word-of-mouth velocity. I did this once for a client and discovered they were rated higher than the category leader on quality but had a 12-point lower recommend rate. That's a positioning problem, not a product problem — completely different fix.
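
The matrix itself is easy to build in code once each competitor's file is loaded. A sketch with hypothetical product names and (star_rating, recommended) pairs standing in for the per-file rows:

```python
# Hypothetical per-competitor data, as if loaded from each spreadsheet in the ZIP.
products = {
    "acme":  [(5, "Y"), (4, "Y"), (3, "N")],
    "rival": [(5, "Y"), (5, "Y"), (4, "Y")],
}

# One row per competitor: quality (avg rating) vs word-of-mouth (recommend rate).
matrix = {
    name: {
        "avg_rating": round(sum(s for s, _ in rows) / len(rows), 2),
        "recommend_rate": round(sum(r == "Y" for _, r in rows) / len(rows), 2),
    }
    for name, rows in products.items()
}
```

A product can lead on one column and trail on the other — exactly the quality-vs-positioning split described above.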

Decision-maker outreach. Filter role for Founder, Co-founder, CEO, VP, Head of, or Director. The remaining list is decision-makers who already actively evaluate tooling in your space — they wrote a review unprompted. Pair with the company column, enrich emails via your usual stack, and your outbound is far warmer than a cold list could ever be.
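
The role filter is a one-line regex if you'd rather script it than click through spreadsheet filters. A sketch assuming the export's `role` and `company` columns, with invented reviewer rows:

```python
import re

# Case-insensitive match on the decision-maker titles listed above.
DECISION_MAKER = re.compile(r"\b(founder|co-founder|ceo|vp|head of|director)\b", re.I)

# Hypothetical rows standing in for the exported reviewer columns.
reviewers = [
    {"reviewer_name": "Ana", "role": "Co-founder & CEO",   "company": "Acme"},
    {"reviewer_name": "Ben", "role": "Software Engineer",  "company": "Rival"},
    {"reviewer_name": "Cy",  "role": "Head of Growth",     "company": "Beta"},
]

targets = [r for r in reviewers if DECISION_MAKER.search(r["role"])]
```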

Feature-request and complaint mining. Filter star_rating <= 3, paste the review_text column into ChatGPT with: "Cluster these Product Hunt reviews by complaint theme. For each theme, give me a count, three representative quotes, and the named role the complaint came from." You'll get a roadmap input that's both quantitative and quotable. Bonus: when an engineer pushes back on the priority, you've got named CTOs in the file backing it up.
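
Assembling that prompt from the export can be automated too. A sketch with invented rows, keeping the role attached to each quote so the clusters stay quotable:

```python
# Hypothetical rows from the export — only star_rating <= 3 feeds the prompt.
rows = [
    {"star_rating": 2, "role": "CTO",    "review_text": "Import keeps failing."},
    {"star_rating": 5, "role": "PM",     "review_text": "Great onboarding."},
    {"star_rating": 3, "role": "VP Eng", "review_text": "Missing SSO."},
]

negatives = [r for r in rows if r["star_rating"] <= 3]

prompt = (
    "Cluster these Product Hunt reviews by complaint theme. For each theme, "
    "give me a count, three representative quotes, and the named role the "
    "complaint came from.\n\n"
    + "\n".join(f"- [{r['role']}] {r['review_text']}" for r in negatives)
)
```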

Trust-badge generation. If it's your own product, filter star_rating = 5 AND recommended = Y, sort by helpful_votes descending. The top rows are the best testimonials you've ever written — except you didn't write them. Drop the headline + company + quote into a testimonial component on your landing page. Resend's homepage essentially uses this pattern.

Negative-review triage SLA. Schedule the export to run hourly. Pipe new star_rating <= 2 rows to a Slack channel via webhook so support sees them in real time. Reaching out the same day a negative Product Hunt review goes live is one of the highest-converting save plays in SaaS — the reviewer almost never expects a reply, let alone a fix.
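
The Slack side of that pipeline is a small webhook POST. A sketch assuming the export's column names; the webhook URL and the message format are illustrative, not prescribed:

```python
import json
import urllib.request

SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # hypothetical

def slack_payload(review):
    """Format a low-star review row as a Slack incoming-webhook payload."""
    return {
        "text": (
            f"{review['star_rating']}-star Product Hunt review from "
            f"{review['reviewer_name']} ({review['role']}, {review['company']}):\n"
            f"> {review['review_text']}\n{review['profile_url']}"
        )
    }

def post_to_slack(review):
    """Fire the webhook so support sees the review in real time."""
    req = urllib.request.Request(
        SLACK_WEBHOOK_URL,
        data=json.dumps(slack_payload(review)).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

# Sample row standing in for a freshly exported negative review.
payload = slack_payload({
    "star_rating": 2, "reviewer_name": "Ana", "role": "CTO",
    "company": "Acme", "review_text": "Import keeps failing.",
    "profile_url": "https://www.producthunt.com/@ana",
})
```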

Plan limits and API access

The Free plan returns up to 100 reviews per export, enough to sanity-check the workflow on smaller product pages. Paid tiers scale up: Personal returns 5,000 results per export, Premium 50,000, Business 250,000. The full breakdown — concurrent exports, scheduled-job quotas, webhook delivery limits — is on /pricing.

If you'd rather have Product Hunt review data flowing into your data warehouse, support tooling, or marketing site automatically, the REST API exposes the same exporter as a single call and delivers results via webhook when the job's done. Reference at /api.
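
In code, a job submission looks roughly like the sketch below. The endpoint path, field names, and auth scheme here are assumptions for illustration — the real contract is documented at /api:

```python
import json
import urllib.request

API_TOKEN = "YOUR_TOKEN"  # hypothetical credential

def build_export_job(product_url, fmt="json"):
    """Assemble the JSON body for a review-export job (assumed schema)."""
    return {"url": product_url, "format": fmt}

def submit(job):
    """POST the job; results are delivered via webhook when it finishes."""
    req = urllib.request.Request(
        "https://exportcomments.com/api/...",  # replace with the real endpoint from /api
        data=json.dumps(job).encode(),
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "application/json",
        },
    )
    return urllib.request.urlopen(req)

job = build_export_job("https://www.producthunt.com/products/acme")
```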

FAQ

  • Does this export the reviews from the launch page or from the product page?
    From the dedicated product page (the one with the reviews tab). The launch post and the product page are separate Product Hunt objects — reviews live on the product, comments live on the launch.
  • Can I separate verified-buyer reviews from other reviews?
    Product Hunt does not collect a verified-buyer signal the way Amazon or G2 do, so there is no such column. The reviewer headline + company is the closest thing to a context check you'll get on the platform.
  • How are deleted or moderator-removed reviews handled?
    They are not visible on the page and so are not in the export. You get the public review state at the moment the export runs.
  • Will the export show owner / maker replies to reviews?
    This tutorial covers the review records themselves. If a product page surfaces threaded maker replies, they appear as separate rows linked back to the review by parent ID, mirroring how the comments exporter handles reply threads.
  • Can I schedule review exports to run automatically?
    Yes. Premium and Business plans support scheduled exports — hourly, daily or weekly — which is the cleanest way to keep a competitor benchmarking dashboard fresh without manual work.
  • What if a product has more reviews than my plan's per-export limit?
    The export returns up to your plan cap, newest-first by default. Upgrade to a higher tier or use the API to paginate; contact support if you're not sure which path fits.