Productivity

requesthunt

resciencelab/opc-skills · updated Apr 8, 2026

$ npx skills add https://github.com/resciencelab/opc-skills --skill requesthunt
summary

Collect and analyze real user feedback from Reddit, X, and GitHub to generate demand research reports.

  • Scrapes feature requests, complaints, and questions across three platforms with filtering by topic, category, platform, and time range
  • Includes search with real-time expansion, topic browsing, and sorting by popularity to identify top user demands
  • Provides structured workflow from scope definition through data collection to formatted Markdown report generation
  • Rate-limited API wi
skill.md

RequestHunt Skill

Generate user demand research reports by collecting and analyzing real user feedback from Reddit, X (Twitter), and GitHub.

Prerequisites

Install the CLI and authenticate:

curl -fsSL https://requesthunt.com/cli | sh
requesthunt auth login

The CLI displays a verification code and opens https://requesthunt.com/device; the human must enter the code there to approve. Verify with:

requesthunt config show

Expected output contains resolved_api_key: followed by a masked key value (not null).

For headless/CI environments, use a manual API key instead:

requesthunt config set-key rh_live_your_key

Get your key from: https://requesthunt.com/dashboard
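In CI, the key can be injected from a secret. A hypothetical GitHub Actions fragment (the secret name REQUESTHUNT_API_KEY is an assumption; use whatever your CI provides):

```yaml
# Hypothetical CI step: configure RequestHunt from a repository secret
- name: Configure RequestHunt
  run: requesthunt config set-key "$REQUESTHUNT_API_KEY"
  env:
    REQUESTHUNT_API_KEY: ${{ secrets.REQUESTHUNT_API_KEY }}
```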

Output Modes

Default output is TOON (Token-Oriented Object Notation) — structured and token-efficient. Use --json for raw JSON or --human for table/key-value display.
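The --json mode is convenient for post-processing with jq. A minimal sketch; the JSON shape below is a guess for illustration, so check the real schema against actual --json output before relying on field names:

```shell
# Stand-in for `requesthunt search ... --json` output (hypothetical schema)
cat > sample.json <<'EOF'
{"results": [
  {"title": "Add SSO support", "platform": "github", "votes": 42},
  {"title": "Dark mode", "platform": "reddit", "votes": 17}
]}
EOF

# Rank requests by votes, highest first
jq -r '.results | sort_by(-.votes)[] | "\(.votes)\t\(.platform)\t\(.title)"' sample.json
```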

Research Workflow

Step 1: Define Scope

Before collecting data, clarify with the user:

  1. Research Goal: What domain/area to investigate? (e.g., AI coding assistants, project management tools)
  2. Specific Products: Any products/competitors to focus on? (e.g., Cursor, GitHub Copilot)
  3. Platform Preference: Which platforms to prioritize? (reddit, x, github)
  4. Time Range: How recent should the feedback be?
  5. Report Purpose: Product planning / competitive analysis / market research?

Step 2: Collect Data

# 1. Trigger realtime scrape for the topic
requesthunt scrape start "ai-coding-assistant" --platforms reddit,x,github --depth 2

# 2. Search with expansion for more data
requesthunt search "code completion" --expand --limit 50

# 3. List requests filtered by topic
requesthunt list --topic "ai-tools" --limit 100

Step 3: Generate Report

Analyze collected data and generate a structured Markdown report:

# [Topic] User Demand Research Report

## Overview
- Scope: ...
- Data Sources: Reddit (count), X (count), GitHub (count)
- Time Range: ...

## Key Findings

### 1. Top Feature Requests
| Rank | Request | Sources | Representative Quote |
|------|---------|---------|---------------------|

### 2. Pain Points Analysis
- **Pain Point A**: ...

### 3. Competitive Comparison (if specified)
| Feature | Product A | Product B | User Expectations |

### 4. Opportunities
- ...

## Methodology
Based on N real user feedback items collected via RequestHunt...
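The skeleton above can be scaffolded from the per-platform collection counts. A minimal sketch; the topic and counts are placeholders, not real data:

```shell
# Placeholders standing in for real collection results
topic="AI Coding Assistants"
reddit_n=120; x_n=80; github_n=45
total=$((reddit_n + x_n + github_n))

# Write the report header and methodology sections
{
  printf '# %s User Demand Research Report\n\n' "$topic"
  printf '## Overview\n'
  printf -- '- Data Sources: Reddit (%s), X (%s), GitHub (%s)\n\n' "$reddit_n" "$x_n" "$github_n"
  printf '## Methodology\n'
  printf 'Based on %s real user feedback items collected via RequestHunt.\n' "$total"
} > report.md
```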

Commands

Search

requesthunt search "authentication" --limit 20
requesthunt search "oauth" --expand                          # With realtime expansion
requesthunt search "API rate limit" --expand --platforms reddit,x

List

requesthunt list --limit 20                                  # Recent requests
requesthunt list --topic "ai-tools" --limit 10               # By topic
requesthunt list --platforms reddit,github                    # By platform
requesthunt list --category "Developer Tools"                # By category
requesthunt list --sort top --limit 20                       # Top voted

Scrape

requesthunt scrape start "developer-tools" --depth 1         # Default: all platforms
requesthunt scrape start "ai-assistant" --platforms reddit,x,github --depth 2
requesthunt scrape status "job_123"                          # Check job status

Reference

requesthunt topics                                           # List all topics by category
requesthunt usage                                            # View account stats
requesthunt config show                                      # Check auth status

API Info