Compare commits

...

29 Commits

Author SHA1 Message Date
Boris Cherny
c40c658e1f Add GitHub workflow logging for issue closure events
Extends the existing log-issue-events workflow to capture detailed metrics when issues are closed, including:
- Who closed the issue
- Whether it was closed automatically (by a bot)
- Whether it was closed as a duplicate
- Number of comments and reactions at closing time
- Issue state reason and timestamp

This provides comprehensive analytics for issue lifecycle tracking alongside the existing issue creation logging.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-08-11 13:13:52 -07:00
Ashwin Bhat
e05411140d Remove timeout-minutes from lock-closed-issues workflow (#5455)
The 10-minute timeout is unnecessary for this workflow, which typically
completes quickly. Removing it lets the workflow fall back to the default
GitHub Actions job timeout of 6 hours, providing more headroom if the
workflow ever needs to process a large number of issues.

Also fixed trailing whitespace inconsistencies in the script.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-authored-by: Claude <noreply@anthropic.com>
2025-08-09 10:58:52 -07:00
Boris Cherny
ce5b9164fa Merge pull request #5429 from anthropics/boris/bzxy
Add Statsig event logging to GitHub issue workflows
2025-08-08 18:46:52 -07:00
Boris Cherny
5af0b38a92 Add Statsig event logging to GitHub issue workflows
- Log events when issues are closed as duplicates in auto-close script
- Log events when duplicate comments are added via dedupe workflow
- Log events when new issues are created
- Follow existing pattern from code review reactions workflow

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-08-08 18:26:48 -07:00
Boris Cherny
22946869b2 Merge pull request #5423 from anthropics/boris/uqbo
Rename auto-close duplicate issues workflow to remove "dry run"
2025-08-08 17:28:36 -07:00
Boris Cherny
1579216fc7 Enable auto-close duplicate issues workflow
Remove DRY RUN from workflow name and description to activate the automatic closing of duplicate issues after the testing period.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-08-08 14:03:43 -07:00
Ashwin Bhat
f0042afb3b Replace lock-threads action with GitHub Script (#5330)
Replaces the dessant/lock-threads action with a direct GitHub Script
implementation to avoid the deprecated search/issues API endpoint warning.
The new implementation:
- Uses github.rest.issues.listForRepo() instead of the deprecated search API
- Maintains the same 7-day inactivity threshold
- Adds the same comment before locking
- Uses 'resolved' as the lock reason
- Handles pagination properly for large repositories

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-authored-by: Claude <noreply@anthropic.com>
2025-08-08 13:02:20 -07:00
Boris Cherny
6059607354 Merge pull request #5420 from anthropics/boris/cmyy
Fix workflow failure by adding workflow_dispatch trigger
2025-08-08 12:57:16 -07:00
Boris Cherny
478f63be73 Fix workflow failure by adding workflow_dispatch trigger
The backfill-duplicate-comments script was failing because it tried to trigger
claude-dedupe-issues.yml via workflow_dispatch, but that workflow only had an
issues trigger. Added a workflow_dispatch trigger with an issue_number input and
updated the prompt to use either the event's issue number or the dispatch input.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-08-08 12:55:48 -07:00
Boris Cherny
a7edbdc9e7 Merge pull request #5414 from anthropics/boris/aier
Add workflow to backfill duplicate comments
2025-08-08 12:13:19 -07:00
Boris Cherny
399a7dcf2f Add workflow to backfill duplicate comments
🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-08-08 12:11:08 -07:00
Boris Cherny
9ced2a3470 Merge pull request #5413 from anthropics/boris/bktr
Add script to backfill duplicate comments for old issues
2025-08-08 12:08:17 -07:00
Boris Cherny
d7aa61e6f1 Add script to backfill duplicate comments for old issues
Creates a script that identifies old issues without duplicate detection comments and triggers the existing claude-dedupe-issues workflow for each one. This helps ensure historical issues get proper duplicate detection coverage.

Features:
- Scans issues from configurable time period (default 30 days)
- Skips issues that already have duplicate detection comments
- Triggers existing workflow instead of duplicating logic
- Includes dry-run mode and rate limiting for safety

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-08-08 12:06:36 -07:00
Boris Cherny
072856dd5b Merge pull request #5411 from anthropics/boris/gity
Update auto-close-duplicates script to actually close issues
2025-08-08 11:54:07 -07:00
Boris Cherny
1cc90e9b78 Update auto-close-duplicates script to actually close issues
- Remove dry run mode and implement actual issue closing
- Extract duplicate issue number from bot comments
- Close issues via GitHub API with proper state and comments
- Add error handling for API failures
- Use Claude Code comment format with reopening instructions

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-08-08 11:52:34 -07:00
Boris Cherny
483e0e892f Merge pull request #5409 from anthropics/boris/cuxg
Improve auto-close duplicates script pagination and filtering
2025-08-08 11:31:03 -07:00
GitHub Actions
5248fa06bc chore: Update CHANGELOG.md 2025-08-08 18:22:13 +00:00
Boris Cherny
eb48a37a48 Improve auto-close duplicates script pagination and filtering
- Add pagination to fetch more than 100 issues (up to 20 pages/2000 issues)
- Filter to only process issues created more than 3 days ago
- Add created_at field to GitHubIssue interface

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-08-08 11:21:27 -07:00
Boris Cherny
fdb7efe6f1 Merge pull request #5358 from anthropics/boris/eyda
Extract auto-close duplicates script to TypeScript with Bun
2025-08-08 11:13:27 -07:00
Boris Cherny
7060992aa5 Update .github/workflows/auto-close-duplicates.yml
Co-authored-by: Ashwin Bhat <ashwin@anthropic.com>
2025-08-08 11:13:05 -07:00
Boris Cherny
b7116c78d8 Extract auto-close duplicates script to TypeScript with Bun
- Move GitHub workflow script logic to scripts/auto-close-duplicates.ts
- Convert from inline JavaScript to standalone TypeScript module
- Update workflow to use Bun instead of Node.js for better performance
- Add proper TypeScript types and ES module syntax

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-08-08 11:11:41 -07:00
GitHub Actions
3fd750e3cb chore: Update CHANGELOG.md 2025-08-07 22:22:12 +00:00
GitHub Actions
3262b673c3 chore: Update CHANGELOG.md 2025-08-07 05:48:29 +00:00
Boris Cherny
369bd9b6d7 Merge pull request #5116 from anthropics/boris/fwbe
Add auto-close functionality for duplicate issues
2025-08-05 17:18:29 -07:00
Boris Cherny
611956def4 Convert auto-close duplicates workflow to dry run mode
Instead of automatically closing duplicate issues, the workflow now:
- Logs URLs of issues that would have been closed
- Runs in dry run mode for safety
- Preserves all detection logic but skips actual closing actions

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-08-05 17:17:51 -07:00
GitHub Actions
a9f1fefbe9 chore: Update CHANGELOG.md 2025-08-05 16:50:00 +00:00
Boris Cherny
3d5ef4e8c0 Add auto-close functionality for duplicate issues
- Update dedupe command to include auto-close warning
- Add commit-push-pr command for streamlined workflow
- Add GitHub workflow to automatically close duplicates

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-08-04 17:09:09 -07:00
GitHub Actions
d967bbbe30 chore: Update CHANGELOG.md 2025-08-04 23:29:26 +00:00
GitHub Actions
5faa082d6e chore: Update CHANGELOG.md 2025-07-31 23:39:25 +00:00
10 changed files with 943 additions and 12 deletions

View File

@@ -0,0 +1,19 @@
---
allowed-tools: Bash(git checkout --branch:*), Bash(git add:*), Bash(git status:*), Bash(git push:*), Bash(git commit:*), Bash(gh pr create:*)
description: Commit, push, and open a PR
---
## Context
- Current git status: !`git status`
- Current git diff (staged and unstaged changes): !`git diff HEAD`
- Current branch: !`git branch --show-current`
## Your task
Based on the above changes:
1. Create a new branch if on main
2. Create a single commit with an appropriate message
3. Push the branch to origin
4. Create a pull request using `gh pr create`
5. You have the capability to call multiple tools in a single response. You MUST do all of the above in a single message. Do not use any other tools or do anything else. Do not send any other text or messages besides these tool calls.
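In practice the command amounts to roughly the following sequence (an illustrative sketch, not part of the file above; the branch name and commit message are placeholders):

# Illustrative sketch of the sequence the command describes; names are placeholders
git checkout -b example-branch   # only if currently on main
git add -A
git commit -m "Describe the change"
git push -u origin example-branch
gh pr create --fill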

View File

@@ -28,7 +28,10 @@ Found 3 possible duplicate issues:
2. <link to issue>
3. <link to issue>
If your issue is a duplicate, please close it and 👍 the existing issue instead.
This issue will be automatically closed as a duplicate in 3 days.
- If your issue is a duplicate, please close it and 👍 the existing issue instead
- To prevent auto-closure, add a comment or 👎 this comment
🤖 Generated with [Claude Code](https://claude.ai/code)

.github/workflows/auto-close-duplicates.yml
View File

@@ -0,0 +1,31 @@
name: Auto-close duplicate issues
description: Auto-closes issues that are duplicates of existing issues
on:
  schedule:
    - cron: "0 9 * * *"
  workflow_dispatch:
jobs:
  auto-close-duplicates:
    runs-on: ubuntu-latest
    timeout-minutes: 10
    permissions:
      contents: read
      issues: read
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
      - name: Setup Bun
        uses: oven-sh/setup-bun@v2
        with:
          bun-version: latest
      - name: Auto-close duplicate issues
        run: bun run scripts/auto-close-duplicates.ts
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          GITHUB_REPOSITORY_OWNER: ${{ github.repository_owner }}
          GITHUB_REPOSITORY_NAME: ${{ github.event.repository.name }}
          STATSIG_API_KEY: ${{ secrets.STATSIG_API_KEY }}
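For local testing, the same script the workflow runs can be invoked directly with Bun, supplying the environment variables the workflow passes in (a sketch; the token value is a placeholder):

# Illustrative local run; the token value is a placeholder
GITHUB_TOKEN=ghp_xxx GITHUB_REPOSITORY_OWNER=anthropics GITHUB_REPOSITORY_NAME=claude-code \
  bun run scripts/auto-close-duplicates.ts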

View File

@@ -0,0 +1,44 @@
name: Backfill Duplicate Comments
description: Triggers duplicate detection for old issues that don't have duplicate comments
on:
  workflow_dispatch:
    inputs:
      days_back:
        description: 'How many days back to look for old issues'
        required: false
        default: '90'
        type: string
      dry_run:
        description: 'Dry run mode (true to only log what would be done)'
        required: false
        default: 'true'
        type: choice
        options:
          - 'true'
          - 'false'
jobs:
  backfill-duplicate-comments:
    runs-on: ubuntu-latest
    timeout-minutes: 30
    permissions:
      contents: read
      issues: read
      actions: write
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
      - name: Setup Bun
        uses: oven-sh/setup-bun@v2
        with:
          bun-version: latest
      - name: Backfill duplicate comments
        run: bun run scripts/backfill-duplicate-comments.ts
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          DAYS_BACK: ${{ inputs.days_back }}
          DRY_RUN: ${{ inputs.dry_run }}
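Since this workflow is manual-only (workflow_dispatch), it can be kicked off from the GitHub CLI; a sketch, assuming the file is saved as backfill-duplicate-comments.yml:

# Illustrative manual trigger; the workflow file name is an assumption
gh workflow run backfill-duplicate-comments.yml -f days_back=30 -f dry_run=true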

.github/workflows/claude-dedupe-issues.yml
View File

@@ -3,6 +3,12 @@ description: Automatically dedupe GitHub issues using Claude Code
on:
  issues:
    types: [opened]
  workflow_dispatch:
    inputs:
      issue_number:
        description: 'Issue number to process for duplicate detection'
        required: true
        type: string
jobs:
  claude-dedupe-issues:
@@ -19,7 +25,56 @@ jobs:
      - name: Run Claude Code slash command
        uses: anthropics/claude-code-base-action@beta
        with:
          prompt: "/dedupe ${{ github.repository }}/issues/${{ github.event.issue.number }}"
          prompt: "/dedupe ${{ github.repository }}/issues/${{ github.event.issue.number || inputs.issue_number }}"
          anthropic_api_key: ${{ secrets.ANTHROPIC_API_KEY }}
          claude_env: |
            GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
      - name: Log duplicate comment event to Statsig
        if: always()
        env:
          STATSIG_API_KEY: ${{ secrets.STATSIG_API_KEY }}
        run: |
          ISSUE_NUMBER=${{ github.event.issue.number || inputs.issue_number }}
          REPO=${{ github.repository }}
          if [ -z "$STATSIG_API_KEY" ]; then
            echo "STATSIG_API_KEY not found, skipping Statsig logging"
            exit 0
          fi
          # Prepare the event payload
          EVENT_PAYLOAD=$(jq -n \
            --arg issue_number "$ISSUE_NUMBER" \
            --arg repo "$REPO" \
            --arg triggered_by "${{ github.event_name }}" \
            '{
              events: [{
                eventName: "github_duplicate_comment_added",
                value: 1,
                metadata: {
                  repository: $repo,
                  issue_number: ($issue_number | tonumber),
                  triggered_by: $triggered_by,
                  workflow_run_id: "${{ github.run_id }}"
                },
                time: (now | floor | tostring)
              }]
            }')
          # Send to Statsig API
          echo "Logging duplicate comment event to Statsig for issue #${ISSUE_NUMBER}"
          RESPONSE=$(curl -s -w "\n%{http_code}" -X POST https://events.statsigapi.net/v1/log_event \
            -H "Content-Type: application/json" \
            -H "STATSIG-API-KEY: ${STATSIG_API_KEY}" \
            -d "$EVENT_PAYLOAD")
          HTTP_CODE=$(echo "$RESPONSE" | tail -n1)
          BODY=$(echo "$RESPONSE" | head -n-1)
          if [ "$HTTP_CODE" -eq 200 ] || [ "$HTTP_CODE" -eq 202 ]; then
            echo "Successfully logged duplicate comment event for issue #${ISSUE_NUMBER}"
          else
            echo "Failed to log duplicate comment event for issue #${ISSUE_NUMBER}. HTTP ${HTTP_CODE}: ${BODY}"
          fi
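With the new workflow_dispatch trigger, duplicate detection can also be run for a single existing issue from the GitHub CLI (a sketch; the issue number is an example):

# Illustrative manual dispatch for one issue; 1234 is an example number
gh workflow run claude-dedupe-issues.yml -f issue_number=1234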

View File

@@ -13,16 +13,80 @@ concurrency:
  group: lock-threads
jobs:
  lock-threads:
  lock-closed-issues:
    runs-on: ubuntu-latest
    steps:
      - uses: dessant/lock-threads@1bf7ec25051fe7c00bdd17e6a7cf3d7bfb7dc771 # v5.0.1
      - name: Lock closed issues after 7 days of inactivity
        uses: actions/github-script@v7
        with:
          issue-inactive-days: "7"
          process-only: "issues"
          log-output: true
          issue-comment: >
            This issue has been automatically locked since it was
            closed and has not had any activity for 7 days.
            If you're experiencing a similar issue, please file a new issue
            and reference this one if it's relevant.
          script: |
            const sevenDaysAgo = new Date();
            sevenDaysAgo.setDate(sevenDaysAgo.getDate() - 7);
            const lockComment = `This issue has been automatically locked since it was closed and has not had any activity for 7 days. If you're experiencing a similar issue, please file a new issue and reference this one if it's relevant.`;
            let page = 1;
            let hasMore = true;
            let totalLocked = 0;
            while (hasMore) {
              // Get closed issues (pagination)
              const { data: issues } = await github.rest.issues.listForRepo({
                owner: context.repo.owner,
                repo: context.repo.repo,
                state: 'closed',
                sort: 'updated',
                direction: 'asc',
                per_page: 100,
                page: page
              });
              if (issues.length === 0) {
                hasMore = false;
                break;
              }
              for (const issue of issues) {
                // Skip if already locked
                if (issue.locked) continue;
                // Skip pull requests
                if (issue.pull_request) continue;
                // Check if updated more than 7 days ago
                const updatedAt = new Date(issue.updated_at);
                if (updatedAt > sevenDaysAgo) {
                  // Since issues are sorted by updated_at ascending,
                  // once we hit a recent issue, all remaining will be recent too
                  hasMore = false;
                  break;
                }
                try {
                  // Add comment before locking
                  await github.rest.issues.createComment({
                    owner: context.repo.owner,
                    repo: context.repo.repo,
                    issue_number: issue.number,
                    body: lockComment
                  });
                  // Lock the issue
                  await github.rest.issues.lock({
                    owner: context.repo.owner,
                    repo: context.repo.repo,
                    issue_number: issue.number,
                    lock_reason: 'resolved'
                  });
                  totalLocked++;
                  console.log(`Locked issue #${issue.number}: ${issue.title}`);
                } catch (error) {
                  console.error(`Failed to lock issue #${issue.number}: ${error.message}`);
                }
              }
              page++;
            }
            console.log(`Total issues locked: ${totalLocked}`);

.github/workflows/log-issue-events.yml (new file, 180 lines)
View File

@@ -0,0 +1,180 @@
name: Log GitHub Issue Events
on:
  issues:
    types: [opened, closed]
jobs:
  log-issue-created:
    if: github.event.action == 'opened'
    runs-on: ubuntu-latest
    timeout-minutes: 5
    permissions:
      contents: read
      issues: read
    steps:
      - name: Log issue creation to Statsig
        env:
          STATSIG_API_KEY: ${{ secrets.STATSIG_API_KEY }}
        run: |
          ISSUE_NUMBER=${{ github.event.issue.number }}
          REPO=${{ github.repository }}
          ISSUE_TITLE="${{ github.event.issue.title }}"
          AUTHOR="${{ github.event.issue.user.login }}"
          CREATED_AT="${{ github.event.issue.created_at }}"
          if [ -z "$STATSIG_API_KEY" ]; then
            echo "STATSIG_API_KEY not found, skipping Statsig logging"
            exit 0
          fi
          # Prepare the event payload
          EVENT_PAYLOAD=$(jq -n \
            --arg issue_number "$ISSUE_NUMBER" \
            --arg repo "$REPO" \
            --arg title "$ISSUE_TITLE" \
            --arg author "$AUTHOR" \
            --arg created_at "$CREATED_AT" \
            '{
              events: [{
                eventName: "github_issue_created",
                value: 1,
                metadata: {
                  repository: $repo,
                  issue_number: ($issue_number | tonumber),
                  issue_title: $title,
                  issue_author: $author,
                  created_at: $created_at
                },
                time: (now | floor | tostring)
              }]
            }')
          # Send to Statsig API
          echo "Logging issue creation to Statsig for issue #${ISSUE_NUMBER}"
          RESPONSE=$(curl -s -w "\n%{http_code}" -X POST https://events.statsigapi.net/v1/log_event \
            -H "Content-Type: application/json" \
            -H "STATSIG-API-KEY: ${STATSIG_API_KEY}" \
            -d "$EVENT_PAYLOAD")
          HTTP_CODE=$(echo "$RESPONSE" | tail -n1)
          BODY=$(echo "$RESPONSE" | head -n-1)
          if [ "$HTTP_CODE" -eq 200 ] || [ "$HTTP_CODE" -eq 202 ]; then
            echo "Successfully logged issue creation for issue #${ISSUE_NUMBER}"
          else
            echo "Failed to log issue creation for issue #${ISSUE_NUMBER}. HTTP ${HTTP_CODE}: ${BODY}"
          fi
  log-issue-closed:
    if: github.event.action == 'closed'
    runs-on: ubuntu-latest
    timeout-minutes: 5
    permissions:
      contents: read
      issues: read
    steps:
      - name: Log issue closure to Statsig
        env:
          STATSIG_API_KEY: ${{ secrets.STATSIG_API_KEY }}
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          ISSUE_NUMBER=${{ github.event.issue.number }}
          REPO=${{ github.repository }}
          ISSUE_TITLE="${{ github.event.issue.title }}"
          CLOSED_BY="${{ github.event.issue.closed_by.login }}"
          CLOSED_AT="${{ github.event.issue.closed_at }}"
          STATE_REASON="${{ github.event.issue.state_reason }}"
          if [ -z "$STATSIG_API_KEY" ]; then
            echo "STATSIG_API_KEY not found, skipping Statsig logging"
            exit 0
          fi
          # Get additional issue data via GitHub API
          echo "Fetching additional issue data for #${ISSUE_NUMBER}"
          ISSUE_DATA=$(curl -s -H "Authorization: token ${GITHUB_TOKEN}" \
            -H "Accept: application/vnd.github.v3+json" \
            "https://api.github.com/repos/${REPO}/issues/${ISSUE_NUMBER}")
          COMMENTS_COUNT=$(echo "$ISSUE_DATA" | jq -r '.comments')
          # Get reactions data
          REACTIONS_DATA=$(curl -s -H "Authorization: token ${GITHUB_TOKEN}" \
            -H "Accept: application/vnd.github.v3+json" \
            "https://api.github.com/repos/${REPO}/issues/${ISSUE_NUMBER}/reactions")
          REACTIONS_COUNT=$(echo "$REACTIONS_DATA" | jq '. | length')
          # Check if issue was closed automatically (by checking if closed_by is a bot)
          CLOSED_AUTOMATICALLY="false"
          if [[ "$CLOSED_BY" == *"[bot]"* ]]; then
            CLOSED_AUTOMATICALLY="true"
          fi
          # Check if closed as duplicate by looking for duplicate label or state_reason
          CLOSED_AS_DUPLICATE="false"
          if [ "$STATE_REASON" = "not_planned" ]; then
            # Check if issue has duplicate label
            LABELS=$(echo "$ISSUE_DATA" | jq -r '.labels[] | select(.name | test("duplicate"; "i")) | .name')
            if [ -n "$LABELS" ]; then
              CLOSED_AS_DUPLICATE="true"
            fi
          fi
          # Prepare the event payload
          EVENT_PAYLOAD=$(jq -n \
            --arg issue_number "$ISSUE_NUMBER" \
            --arg repo "$REPO" \
            --arg title "$ISSUE_TITLE" \
            --arg closed_by "$CLOSED_BY" \
            --arg closed_at "$CLOSED_AT" \
            --arg state_reason "$STATE_REASON" \
            --arg comments_count "$COMMENTS_COUNT" \
            --arg reactions_count "$REACTIONS_COUNT" \
            --arg closed_automatically "$CLOSED_AUTOMATICALLY" \
            --arg closed_as_duplicate "$CLOSED_AS_DUPLICATE" \
            '{
              events: [{
                eventName: "github_issue_closed",
                value: 1,
                metadata: {
                  repository: $repo,
                  issue_number: ($issue_number | tonumber),
                  issue_title: $title,
                  closed_by: $closed_by,
                  closed_at: $closed_at,
                  state_reason: $state_reason,
                  comments_count: ($comments_count | tonumber),
                  reactions_count: ($reactions_count | tonumber),
                  closed_automatically: ($closed_automatically | test("true")),
                  closed_as_duplicate: ($closed_as_duplicate | test("true"))
                },
                time: (now | floor | tostring)
              }]
            }')
          # Send to Statsig API
          echo "Logging issue closure to Statsig for issue #${ISSUE_NUMBER}"
          RESPONSE=$(curl -s -w "\n%{http_code}" -X POST https://events.statsigapi.net/v1/log_event \
            -H "Content-Type: application/json" \
            -H "STATSIG-API-KEY: ${STATSIG_API_KEY}" \
            -d "$EVENT_PAYLOAD")
          HTTP_CODE=$(echo "$RESPONSE" | tail -n1)
          BODY=$(echo "$RESPONSE" | head -n-1)
          if [ "$HTTP_CODE" -eq 200 ] || [ "$HTTP_CODE" -eq 202 ]; then
            echo "Successfully logged issue closure for issue #${ISSUE_NUMBER}"
            echo "Closed by: $CLOSED_BY"
            echo "Comments: $COMMENTS_COUNT"
            echo "Reactions: $REACTIONS_COUNT"
            echo "Closed automatically: $CLOSED_AUTOMATICALLY"
            echo "Closed as duplicate: $CLOSED_AS_DUPLICATE"
          else
            echo "Failed to log issue closure for issue #${ISSUE_NUMBER}. HTTP ${HTTP_CODE}: ${BODY}"
          fi

CHANGELOG.md
View File

@@ -1,5 +1,35 @@
# Changelog
## 1.0.71
- Background commands: (Ctrl-b) to run any Bash command in the background so Claude can keep working (great for dev servers, tailing logs, etc.)
- Customizable status line: add your terminal prompt to Claude Code with /statusline
## 1.0.70
- Performance: Optimized message rendering for better performance with large contexts
- Windows: Fixed native file search, ripgrep, and subagent functionality
- Added support for @-mentions in slash command arguments
## 1.0.69
- Upgraded Opus to version 4.1
## 1.0.68
- Fix incorrect model names being used for certain commands like `/pr-comments`
- Windows: improve permissions checks for allow / deny tools and project trust. This may create a new project entry in `.claude.json` - manually merge the history field if desired.
- Windows: improve sub-process spawning to eliminate "No such file or directory" when running commands like pnpm
- Enhanced /doctor command with CLAUDE.md and MCP tool context for self-serve debugging
- SDK: Added canUseTool callback support for tool confirmation
- Added `disableAllHooks` setting
- Improved file suggestions performance in large repos
## 1.0.65
- IDE: Fixed connection stability issues and error handling for diagnostics
- Windows: Fixed shell environment setup for users without .bashrc files
## 1.0.64
- Agents: Added model customization support - you can now specify which model an agent should use

scripts/auto-close-duplicates.ts
View File

@@ -0,0 +1,307 @@
#!/usr/bin/env bun
declare global {
var process: {
env: Record<string, string | undefined>;
};
}
interface GitHubIssue {
number: number;
title: string;
user: { id: number };
created_at: string;
}
interface GitHubComment {
id: number;
body: string;
created_at: string;
user: { type: string; id: number };
}
interface GitHubReaction {
user: { id: number };
content: string;
}
async function githubRequest<T>(endpoint: string, token: string, method: string = 'GET', body?: any): Promise<T> {
const response = await fetch(`https://api.github.com${endpoint}`, {
method,
headers: {
Authorization: `Bearer ${token}`,
Accept: "application/vnd.github.v3+json",
"User-Agent": "auto-close-duplicates-script",
...(body && { "Content-Type": "application/json" }),
},
...(body && { body: JSON.stringify(body) }),
});
if (!response.ok) {
throw new Error(
`GitHub API request failed: ${response.status} ${response.statusText}`
);
}
return response.json();
}
function extractDuplicateIssueNumber(commentBody: string): number | null {
const match = commentBody.match(/#(\d+)/);
return match ? parseInt(match[1], 10) : null;
}
async function logStatsigEvent(eventName: string, value: number, metadata: Record<string, any>): Promise<void> {
const statsigApiKey = process.env.STATSIG_API_KEY;
if (!statsigApiKey) {
console.log("[DEBUG] STATSIG_API_KEY not found, skipping Statsig logging");
return;
}
const eventPayload = {
events: [{
eventName,
value,
metadata,
time: Math.floor(Date.now()).toString()
}]
};
try {
const response = await fetch('https://events.statsigapi.net/v1/log_event', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'STATSIG-API-KEY': statsigApiKey
},
body: JSON.stringify(eventPayload)
});
if (response.ok) {
console.log(`[DEBUG] Successfully logged Statsig event: ${eventName}`);
} else {
console.log(`[DEBUG] Failed to log Statsig event: ${response.status} ${response.statusText}`);
}
} catch (error) {
console.log(`[DEBUG] Error logging to Statsig: ${error}`);
}
}
async function closeIssueAsDuplicate(
owner: string,
repo: string,
issueNumber: number,
duplicateOfNumber: number,
token: string
): Promise<void> {
await githubRequest(
`/repos/${owner}/${repo}/issues/${issueNumber}`,
token,
'PATCH',
{
state: 'closed',
state_reason: 'not_planned'
}
);
await githubRequest(
`/repos/${owner}/${repo}/issues/${issueNumber}/comments`,
token,
'POST',
{
body: `This issue has been automatically closed as a duplicate of #${duplicateOfNumber}.
If this is incorrect, please re-open this issue or create a new one.
🤖 Generated with [Claude Code](https://claude.ai/code)`
}
);
// Log to Statsig
await logStatsigEvent('github_issue_closed_as_duplicate', 1, {
repository: `${owner}/${repo}`,
issue_number: issueNumber,
duplicate_of_issue: duplicateOfNumber,
closed_by: 'auto-close-script'
});
}
async function autoCloseDuplicates(): Promise<void> {
console.log("[DEBUG] Starting auto-close duplicates script");
const token = process.env.GITHUB_TOKEN;
if (!token) {
throw new Error("GITHUB_TOKEN environment variable is required");
}
console.log("[DEBUG] GitHub token found");
const owner = process.env.GITHUB_REPOSITORY_OWNER || "anthropics";
const repo = process.env.GITHUB_REPOSITORY_NAME || "claude-code";
console.log(`[DEBUG] Repository: ${owner}/${repo}`);
const threeDaysAgo = new Date();
threeDaysAgo.setDate(threeDaysAgo.getDate() - 3);
console.log(
`[DEBUG] Checking for duplicate comments older than: ${threeDaysAgo.toISOString()}`
);
console.log("[DEBUG] Fetching open issues created more than 3 days ago...");
const allIssues: GitHubIssue[] = [];
let page = 1;
const perPage = 100;
while (true) {
const pageIssues: GitHubIssue[] = await githubRequest(
`/repos/${owner}/${repo}/issues?state=open&per_page=${perPage}&page=${page}`,
token
);
if (pageIssues.length === 0) break;
// Filter for issues created more than 3 days ago
const oldEnoughIssues = pageIssues.filter(issue =>
new Date(issue.created_at) <= threeDaysAgo
);
allIssues.push(...oldEnoughIssues);
page++;
// Safety limit to avoid infinite loops
if (page > 20) break;
}
const issues = allIssues;
console.log(`[DEBUG] Found ${issues.length} open issues`);
let processedCount = 0;
let candidateCount = 0;
for (const issue of issues) {
processedCount++;
console.log(
`[DEBUG] Processing issue #${issue.number} (${processedCount}/${issues.length}): ${issue.title}`
);
console.log(`[DEBUG] Fetching comments for issue #${issue.number}...`);
const comments: GitHubComment[] = await githubRequest(
`/repos/${owner}/${repo}/issues/${issue.number}/comments`,
token
);
console.log(
`[DEBUG] Issue #${issue.number} has ${comments.length} comments`
);
const dupeComments = comments.filter(
(comment) =>
comment.body.includes("Found") &&
comment.body.includes("possible duplicate") &&
comment.user.type === "Bot"
);
console.log(
`[DEBUG] Issue #${issue.number} has ${dupeComments.length} duplicate detection comments`
);
if (dupeComments.length === 0) {
console.log(
`[DEBUG] Issue #${issue.number} - no duplicate comments found, skipping`
);
continue;
}
const lastDupeComment = dupeComments[dupeComments.length - 1];
const dupeCommentDate = new Date(lastDupeComment.created_at);
console.log(
`[DEBUG] Issue #${
issue.number
} - most recent duplicate comment from: ${dupeCommentDate.toISOString()}`
);
if (dupeCommentDate > threeDaysAgo) {
console.log(
`[DEBUG] Issue #${issue.number} - duplicate comment is too recent, skipping`
);
continue;
}
console.log(
`[DEBUG] Issue #${
issue.number
} - duplicate comment is old enough (${Math.floor(
(Date.now() - dupeCommentDate.getTime()) / (1000 * 60 * 60 * 24)
)} days)`
);
const commentsAfterDupe = comments.filter(
(comment) => new Date(comment.created_at) > dupeCommentDate
);
console.log(
`[DEBUG] Issue #${issue.number} - ${commentsAfterDupe.length} comments after duplicate detection`
);
if (commentsAfterDupe.length > 0) {
console.log(
`[DEBUG] Issue #${issue.number} - has activity after duplicate comment, skipping`
);
continue;
}
console.log(
`[DEBUG] Issue #${issue.number} - checking reactions on duplicate comment...`
);
const reactions: GitHubReaction[] = await githubRequest(
`/repos/${owner}/${repo}/issues/comments/${lastDupeComment.id}/reactions`,
token
);
console.log(
`[DEBUG] Issue #${issue.number} - duplicate comment has ${reactions.length} reactions`
);
const authorThumbsDown = reactions.some(
(reaction) =>
reaction.user.id === issue.user.id && reaction.content === "-1"
);
console.log(
`[DEBUG] Issue #${issue.number} - author thumbs down reaction: ${authorThumbsDown}`
);
if (authorThumbsDown) {
console.log(
`[DEBUG] Issue #${issue.number} - author disagreed with duplicate detection, skipping`
);
continue;
}
const duplicateIssueNumber = extractDuplicateIssueNumber(lastDupeComment.body);
if (!duplicateIssueNumber) {
console.log(
`[DEBUG] Issue #${issue.number} - could not extract duplicate issue number from comment, skipping`
);
continue;
}
candidateCount++;
const issueUrl = `https://github.com/${owner}/${repo}/issues/${issue.number}`;
try {
console.log(
`[INFO] Auto-closing issue #${issue.number} as duplicate of #${duplicateIssueNumber}: ${issueUrl}`
);
await closeIssueAsDuplicate(owner, repo, issue.number, duplicateIssueNumber, token);
console.log(
`[SUCCESS] Successfully closed issue #${issue.number} as duplicate of #${duplicateIssueNumber}`
);
} catch (error) {
console.error(
`[ERROR] Failed to close issue #${issue.number} as duplicate: ${error}`
);
}
}
console.log(
`[DEBUG] Script completed. Processed ${processedCount} issues, found ${candidateCount} candidates for auto-close`
);
}
autoCloseDuplicates().catch(console.error);
// Make it a module
export {};

scripts/backfill-duplicate-comments.ts
View File

@@ -0,0 +1,198 @@
#!/usr/bin/env bun
declare global {
var process: {
env: Record<string, string | undefined>;
};
}
interface GitHubIssue {
number: number;
title: string;
state: string;
state_reason?: string;
user: { id: number };
created_at: string;
closed_at?: string;
}
interface GitHubComment {
id: number;
body: string;
created_at: string;
user: { type: string; id: number };
}
async function githubRequest<T>(endpoint: string, token: string, method: string = 'GET', body?: any): Promise<T> {
const response = await fetch(`https://api.github.com${endpoint}`, {
method,
headers: {
Authorization: `Bearer ${token}`,
Accept: "application/vnd.github.v3+json",
"User-Agent": "backfill-duplicate-comments-script",
...(body && { "Content-Type": "application/json" }),
},
...(body && { body: JSON.stringify(body) }),
});
if (!response.ok) {
throw new Error(
`GitHub API request failed: ${response.status} ${response.statusText}`
);
}
return response.json();
}
async function triggerDedupeWorkflow(
owner: string,
repo: string,
issueNumber: number,
token: string,
dryRun: boolean = true
): Promise<void> {
if (dryRun) {
console.log(`[DRY RUN] Would trigger dedupe workflow for issue #${issueNumber}`);
return;
}
await githubRequest(
`/repos/${owner}/${repo}/actions/workflows/claude-dedupe-issues.yml/dispatches`,
token,
'POST',
{
ref: 'main',
inputs: {
issue_number: issueNumber.toString()
}
}
);
}
async function backfillDuplicateComments(): Promise<void> {
console.log("[DEBUG] Starting backfill duplicate comments script");
const token = process.env.GITHUB_TOKEN;
if (!token) {
throw new Error(`GITHUB_TOKEN environment variable is required
Usage:
GITHUB_TOKEN=your_token bun run scripts/backfill-duplicate-comments.ts
Environment Variables:
GITHUB_TOKEN - GitHub personal access token with repo and actions permissions (required)
DRY_RUN - Set to "false" to actually trigger workflows (default: true for safety)
DAYS_BACK - How many days back to look for old issues (default: 90)`);
}
console.log("[DEBUG] GitHub token found");
const owner = "anthropics";
const repo = "claude-code";
const dryRun = process.env.DRY_RUN !== "false";
const daysBack = parseInt(process.env.DAYS_BACK || "90", 10);
console.log(`[DEBUG] Repository: ${owner}/${repo}`);
console.log(`[DEBUG] Dry run mode: ${dryRun}`);
console.log(`[DEBUG] Looking back ${daysBack} days`);
const cutoffDate = new Date();
cutoffDate.setDate(cutoffDate.getDate() - daysBack);
console.log(`[DEBUG] Fetching issues created since ${cutoffDate.toISOString()}...`);
const allIssues: GitHubIssue[] = [];
let page = 1;
const perPage = 100;
while (true) {
const pageIssues: GitHubIssue[] = await githubRequest(
`/repos/${owner}/${repo}/issues?state=all&per_page=${perPage}&page=${page}&since=${cutoffDate.toISOString()}`,
token
);
if (pageIssues.length === 0) break;
allIssues.push(...pageIssues);
page++;
// Safety limit to avoid infinite loops
if (page > 100) {
console.log("[DEBUG] Reached page limit, stopping pagination");
break;
}
}
console.log(`[DEBUG] Found ${allIssues.length} issues from the last ${daysBack} days`);
let processedCount = 0;
let candidateCount = 0;
let triggeredCount = 0;
for (const issue of allIssues) {
processedCount++;
console.log(
`[DEBUG] Processing issue #${issue.number} (${processedCount}/${allIssues.length}): ${issue.title}`
);
console.log(`[DEBUG] Fetching comments for issue #${issue.number}...`);
const comments: GitHubComment[] = await githubRequest(
`/repos/${owner}/${repo}/issues/${issue.number}/comments`,
token
);
console.log(
`[DEBUG] Issue #${issue.number} has ${comments.length} comments`
);
// Look for existing duplicate detection comments (from the dedupe bot)
const dupeDetectionComments = comments.filter(
(comment) =>
comment.body.includes("Found") &&
comment.body.includes("possible duplicate") &&
comment.user.type === "Bot"
);
console.log(
`[DEBUG] Issue #${issue.number} has ${dupeDetectionComments.length} duplicate detection comments`
);
// Skip if there's already a duplicate detection comment
if (dupeDetectionComments.length > 0) {
console.log(
`[DEBUG] Issue #${issue.number} already has duplicate detection comment, skipping`
);
continue;
}
candidateCount++;
const issueUrl = `https://github.com/${owner}/${repo}/issues/${issue.number}`;
try {
console.log(
`[INFO] ${dryRun ? '[DRY RUN] ' : ''}Triggering dedupe workflow for issue #${issue.number}: ${issueUrl}`
);
await triggerDedupeWorkflow(owner, repo, issue.number, token, dryRun);
if (!dryRun) {
console.log(
`[SUCCESS] Successfully triggered dedupe workflow for issue #${issue.number}`
);
}
triggeredCount++;
} catch (error) {
console.error(
`[ERROR] Failed to trigger workflow for issue #${issue.number}: ${error}`
);
}
// Add a delay between workflow triggers to avoid overwhelming the system
await new Promise(resolve => setTimeout(resolve, 1000));
}
console.log(
`[DEBUG] Script completed. Processed ${processedCount} issues, found ${candidateCount} candidates without duplicate comments, ${dryRun ? 'would trigger' : 'triggered'} ${triggeredCount} workflows`
);
}
backfillDuplicateComments().catch(console.error);
// Make it a module
export {};