github-leads
Find potential customers from GitHub stargazer activity. Monitor any repositories for new stars, scrape historic stargazers on first run, and use AI to qualify leads automatically.
- Historic Scraping: On first run, scrapes all existing stargazers from configured repositories
- Ongoing Monitoring: A cron job monitors for new stars on your configured repositories
- AI Lead Qualification: An AI agent researches each stargazer and scores them as a lead
- Dashboard & Alerts: Qualified leads appear in your dashboard with periodic digest emails
- Click Remix
- Add environment variables (see the sketch below for how these are read):
  - `OPENAI_API_KEY` — for AI lead qualification
  - `GITHUB_TOKEN` — for accessing the GitHub API (create one here)
- Configure your repositories:
  - Edit the `GITHUB_REPOS` array in `github.cron.ts`
  - Add repositories in "owner/repo" format (e.g., "facebook/react")
- Configure email settings:
  - Edit `RECIPIENTS` in `digest.ts`
  - Customize `PROMPT.txt` with your ICP criteria
- Open `main.ts` to view your dashboard
Note: The first run will scrape ALL historic stargazers, which may take time for popular repositories.
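For reference, a minimal sketch of how the vals can read the two secrets from the setup steps above on Val Town (the `requireEnv` helper is illustrative, not part of the project files):

```ts
// Illustrative helper: read a required secret and fail loudly if it is missing.
function requireEnv(name: string): string {
  const value = Deno.env.get(name);
  if (!value) {
    throw new Error(`${name} is required - add it under Environment Variables`);
  }
  return value;
}

export const GITHUB_TOKEN = requireEnv("GITHUB_TOKEN");
export const OPENAI_API_KEY = requireEnv("OPENAI_API_KEY");
```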
Repository Monitoring (github.cron.ts)
- First Run: Scrapes all existing stargazers from configured repositories
- Subsequent Runs: Monitors for new stars since last check
- Processes stargazers in batches to avoid rate limits
- Tracks repository state to know when historic scraping is complete
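The monitoring relies on the standard GitHub REST stargazers endpoint; the `star+json` media type is what exposes `starred_at`, which lets the cron tell new stars from historic ones. A rough sketch (function and type names here are illustrative, not the project's exact code):

```ts
// Illustrative: fetch one page of stargazers, including starred_at timestamps.
interface Stargazer {
  starred_at: string;
  user: { login: string; html_url: string };
}

async function fetchStargazerPage(repo: string, page: number): Promise<Stargazer[]> {
  const res = await fetch(
    `https://api.github.com/repos/${repo}/stargazers?per_page=100&page=${page}`,
    {
      headers: {
        Accept: "application/vnd.github.star+json", // needed for starred_at
        Authorization: `Bearer ${Deno.env.get("GITHUB_TOKEN")}`,
      },
    },
  );
  if (!res.ok) throw new Error(`GitHub API error ${res.status} for ${repo}`);
  return res.json();
}
```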
AI Agent (agent.ts)
- Researches each GitHub user's profile, repos, and linked sites
- Uses web search to learn about their company and role
- Scores them against your ICP defined in `PROMPT.txt`
- Returns `{name, match, score, leadTypes, reasoning}`
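A hedged sketch of the scoring call, using the `npm:openai` client with JSON-mode chat completions (the model choice and the `icpPrompt`/`profileSummary` inputs are assumptions standing in for the agent's research step):

```ts
import OpenAI from "npm:openai";

const openai = new OpenAI({ apiKey: Deno.env.get("OPENAI_API_KEY") });

// Illustrative: score a researched stargazer against the ICP from PROMPT.txt.
async function scoreLead(icpPrompt: string, profileSummary: string) {
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini", // assumption: any JSON-mode capable model works here
    response_format: { type: "json_object" },
    messages: [
      { role: "system", content: icpPrompt },
      {
        role: "user",
        content:
          "Score this GitHub user as a lead. Reply as JSON with keys " +
          "name, match (boolean), score (0-100), leadTypes (string[]), reasoning.\n\n" +
          profileSummary,
      },
    ],
  });
  return JSON.parse(completion.choices[0].message.content ?? "{}");
}
```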
Storage (db.ts)
- Leads Table: Every lead is stored in SQLite with columns:
  - `id` — auto-incremented
  - `timestamp` — when first seen
  - `input_data` — the GitHub stargazer event(s) that triggered it
  - `output_data` — AI result
- Repository State Table: Tracks scraping progress per repository:
  - `repo_name` — repository identifier
  - `last_checked` — timestamp of last check
  - `historic_scrape_complete` — whether the initial scrape is done
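Roughly, those two tables correspond to schemas like the following sketch, using Val Town's std/sqlite (table names and column types here are assumptions; the real definitions live in `db.ts`):

```ts
import { sqlite } from "https://esm.town/v/std/sqlite";

await sqlite.execute(`
  CREATE TABLE IF NOT EXISTS leads (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    timestamp TEXT NOT NULL,
    input_data TEXT NOT NULL,  -- JSON: stargazer event(s) that triggered the lead
    output_data TEXT           -- JSON: the AI agent's result
  )
`);

await sqlite.execute(`
  CREATE TABLE IF NOT EXISTS repo_state (
    repo_name TEXT PRIMARY KEY,
    last_checked TEXT,
    historic_scrape_complete INTEGER DEFAULT 0
  )
`);
```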
Dashboard (main.ts)
- Qualified leads (match=true) appear at the top
- Shows score and lead type tags (customer/hire)
- Click a lead to see full details and GitHub activity
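A minimal sketch of the query behind that ordering, reusing the assumed schema above (the real dashboard renders HTML; this just returns JSON to show the idea):

```ts
import { sqlite } from "https://esm.town/v/std/sqlite";

// Illustrative HTTP val: qualified leads (match=true) first, newest first.
export default async function (req: Request): Promise<Response> {
  const result = await sqlite.execute(`
    SELECT id, timestamp, output_data
    FROM leads
    ORDER BY json_extract(output_data, '$.match') DESC, timestamp DESC
  `);
  return Response.json(result.rows);
}
```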
Email Digest (digest.ts)
- Sends daily emails (1pm UTC) with new qualified leads
- Edit the `RECIPIENTS` array to configure who receives them
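A sketch of the digest step using Val Town's std/email helper (std/email delivers to the val owner; how `digest.ts` fans out to the `RECIPIENTS` list depends on the email provider it uses, so treat this as illustrative):

```ts
import { email } from "https://esm.town/v/std/email";
import { sqlite } from "https://esm.town/v/std/sqlite";

// Illustrative cron: collect the last 24h of qualified leads and send a digest.
export default async function () {
  const since = new Date(Date.now() - 24 * 60 * 60 * 1000).toISOString();
  const result = await sqlite.execute({
    sql: `SELECT output_data FROM leads
          WHERE timestamp > ? AND json_extract(output_data, '$.match') = 1`,
    args: [since],
  });
  if (result.rows.length === 0) return;
  await email({
    subject: `GitHub leads digest: ${result.rows.length} new qualified lead(s)`,
    text: result.rows.map((row) => String(row[0])).join("\n\n"),
  });
}
```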
- Repositories: Edit the `GITHUB_REPOS` array in `github.cron.ts` to add/remove repositories
- Lead Criteria: Edit `PROMPT.txt` to define your ideal customer profile
- Monitoring Frequency: Adjust the cron schedule in `github.cron.ts` (default: hourly)
- Email Schedule: Adjust the email digest schedule in `digest.ts` (default: daily at 1pm UTC)
Add repositories to monitor in the `GITHUB_REPOS` array:

```ts
export const GITHUB_REPOS = [
  "your-org/your-repo",
  "facebook/react",
  "microsoft/vscode",
  "octocat/Hello-World", // Good for testing
  // Add more repositories here
];
```
- Historic Scraping: Repositories with more than 1,000 stars will skip historic scraping to avoid timeouts
- Rate Limits: The system processes stargazers in batches with delays to respect GitHub API limits
- Incremental Updates: After the first run, only new stars are processed, making subsequent runs fast
- Error Handling: Individual lead processing failures won't stop the entire job
- Repository Star Limit: Historic scraping is limited to repositories with ≤1,000 stars
- Batch Processing: Leads are processed in batches of 5 to ensure reliability
- Timeout Protection: Large repositories automatically skip historic scraping
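The batching behavior described above can be pictured like this (the batch size of 5 matches the note; the delay value and the helper itself are illustrative):

```ts
// Illustrative: process leads in batches of 5 with a pause between batches,
// so one failing lead or a rate-limited call doesn't take down the whole run.
async function processInBatches<T>(
  items: T[],
  handler: (item: T) => Promise<void>,
  batchSize = 5,
  delayMs = 2000, // assumption: the real delay is set in github.cron.ts
): Promise<void> {
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize);
    await Promise.allSettled(batch.map(handler)); // failures don't abort the job
    if (i + batchSize < items.length) {
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```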
Use the test bar in the dashboard to evaluate any GitHub username instantly.
- "GITHUB_TOKEN is required": Add your GitHub token to environment variables
- Timeouts: Historic scraping is skipped automatically for large repositories; use smaller repositories if you need the full star history
- Rate Limits: The system includes delays and batching to handle GitHub API limits
- Repository Not Found: Ensure repository names are in "owner/repo" format and are public
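If you suspect you are hitting GitHub limits, the standard `/rate_limit` endpoint shows your remaining quota (a quick illustrative check, not part of the project):

```ts
// Illustrative: print remaining GitHub API quota and when it resets.
const res = await fetch("https://api.github.com/rate_limit", {
  headers: { Authorization: `Bearer ${Deno.env.get("GITHUB_TOKEN")}` },
});
const { resources } = await res.json();
console.log(
  `core: ${resources.core.remaining}/${resources.core.limit}, ` +
    `resets ${new Date(resources.core.reset * 1000).toISOString()}`,
);
```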