Compare commits


28 Commits

Author SHA1 Message Date
Boris Cherny
d7aa61e6f1 Add script to backfill duplicate comments for old issues
Creates a script that identifies old issues without duplicate detection comments and triggers the existing claude-dedupe-issues workflow for each one. This helps ensure historical issues get proper duplicate detection coverage.

Features:
- Scans issues from configurable time period (default 30 days)
- Skips issues that already have duplicate detection comments
- Triggers existing workflow instead of duplicating logic
- Includes dry-run mode and rate limiting for safety
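
A hypothetical invocation, using the environment variables the script itself documents (the token value is a placeholder):

# dry run first (the default), then a real pass over the last 30 days
GITHUB_TOKEN=<token> bun run scripts/backfill-duplicate-comments.ts
GITHUB_TOKEN=<token> DRY_RUN=false DAYS_BACK=30 bun run scripts/backfill-duplicate-comments.ts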

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-08-08 12:06:36 -07:00
Boris Cherny
072856dd5b Merge pull request #5411 from anthropics/boris/gity
Update auto-close-duplicates script to actually close issues
2025-08-08 11:54:07 -07:00
Boris Cherny
1cc90e9b78 Update auto-close-duplicates script to actually close issues
- Remove dry run mode and implement actual issue closing
- Extract duplicate issue number from bot comments
- Close issues via GitHub API with proper state and comments
- Add error handling for API failures
- Use Claude Code comment format with reopening instructions

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-08-08 11:52:34 -07:00
Boris Cherny
483e0e892f Merge pull request #5409 from anthropics/boris/cuxg
Improve auto-close duplicates script pagination and filtering
2025-08-08 11:31:03 -07:00
GitHub Actions
5248fa06bc chore: Update CHANGELOG.md 2025-08-08 18:22:13 +00:00
Boris Cherny
eb48a37a48 Improve auto-close duplicates script pagination and filtering
- Add pagination to fetch more than 100 issues (up to 20 pages/2000 issues)
- Filter to only process issues created more than 3 days ago
- Add created_at field to GitHubIssue interface

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-08-08 11:21:27 -07:00
Boris Cherny
fdb7efe6f1 Merge pull request #5358 from anthropics/boris/eyda
Extract auto-close duplicates script to TypeScript with Bun
2025-08-08 11:13:27 -07:00
Boris Cherny
7060992aa5 Update .github/workflows/auto-close-duplicates.yml
Co-authored-by: Ashwin Bhat <ashwin@anthropic.com>
2025-08-08 11:13:05 -07:00
Boris Cherny
b7116c78d8 Extract auto-close duplicates script to TypeScript with Bun
- Move GitHub workflow script logic to scripts/auto-close-duplicates.ts
- Convert from inline JavaScript to standalone TypeScript module
- Update workflow to use Bun instead of Node.js for better performance
- Add proper TypeScript types and ES module syntax

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-08-08 11:11:41 -07:00
GitHub Actions
3fd750e3cb chore: Update CHANGELOG.md 2025-08-07 22:22:12 +00:00
GitHub Actions
3262b673c3 chore: Update CHANGELOG.md 2025-08-07 05:48:29 +00:00
Boris Cherny
369bd9b6d7 Merge pull request #5116 from anthropics/boris/fwbe
Add auto-close functionality for duplicate issues
2025-08-05 17:18:29 -07:00
Boris Cherny
611956def4 Convert auto-close duplicates workflow to dry run mode
Instead of automatically closing duplicate issues, the workflow now:
- Logs URLs of issues that would have been closed
- Runs in dry run mode for safety
- Preserves all detection logic but skips actual closing actions

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-08-05 17:17:51 -07:00
GitHub Actions
a9f1fefbe9 chore: Update CHANGELOG.md 2025-08-05 16:50:00 +00:00
Boris Cherny
3d5ef4e8c0 Add auto-close functionality for duplicate issues
- Update dedupe command to include auto-close warning
- Add commit-push-pr command for streamlined workflow
- Add GitHub workflow to automatically close duplicates

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-08-04 17:09:09 -07:00
GitHub Actions
d967bbbe30 chore: Update CHANGELOG.md 2025-08-04 23:29:26 +00:00
GitHub Actions
5faa082d6e chore: Update CHANGELOG.md 2025-07-31 23:39:25 +00:00
Boris Cherny
60124df21f Merge pull request #4817 from anthropics/bcherny-patch-4
Update dedupe.md
2025-07-30 19:30:06 -07:00
Boris Cherny
1aaffa4e6d Update dedupe.md 2025-07-30 19:29:10 -07:00
Boris Cherny
f5b24d5480 Merge pull request #4816 from anthropics/bcherny-patch-3
Update claude-dedupe-issues.yml
2025-07-30 19:16:36 -07:00
Boris Cherny
07e6bec5ff Update claude-dedupe-issues.yml 2025-07-30 19:09:48 -07:00
GitHub Actions
cf8c6fdf2d chore: Update CHANGELOG.md 2025-07-30 21:49:14 +00:00
ant-kurt
d720bf7aba Merge pull request #4644 from shota-0129/shota-0129/feat-docker-dns-protection
feat: Add Docker DNS protection to firewall script
2025-07-30 10:56:41 -07:00
Kurt Carpenter
dde41f6225 fix: Handle missing Docker DNS rules gracefully in firewall script
Add error handling for grep when no Docker DNS rules exist, preventing
script failure on default Docker networks while still preserving DNS
functionality on custom networks.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-07-30 10:54:05 -07:00
shota-0129
78e0785950 simplify Docker DNS restoration using grep and xargs approach 2025-07-30 14:22:15 +09:00
GitHub Actions
b3658880e5 chore: Update CHANGELOG.md 2025-07-29 21:43:06 +00:00
Boris Cherny
9be8e07e92 Merge pull request #4710 from anthropics/boris/nvko
Add GitHub workflow to deduplicate issues
2025-07-29 11:36:52 -07:00
shota-0129
8b2cbe3f86 feat: Add Docker DNS protection to firewall script
- Extract Docker DNS NAT ports before iptables cleanup
- Restore Docker DNS NAT rules after iptables reset
- Maintains Docker container DNS functionality in network isolation
- Fixes connection refused errors to 127.0.0.11 in Docker environments

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-07-29 13:54:29 +09:00
8 changed files with 573 additions and 3 deletions

View File: commit-push-pr.md

@@ -0,0 +1,19 @@
---
allowed-tools: Bash(git checkout --branch:*), Bash(git add:*), Bash(git status:*), Bash(git push:*), Bash(git commit:*), Bash(gh pr create:*)
description: Commit, push, and open a PR
---

## Context

- Current git status: !`git status`
- Current git diff (staged and unstaged changes): !`git diff HEAD`
- Current branch: !`git branch --show-current`

## Your task

Based on the above changes:
1. Create a new branch if on main
2. Create a single commit with an appropriate message
3. Push the branch to origin
4. Create a pull request using `gh pr create`
5. You have the capability to call multiple tools in a single response. You MUST do all of the above in a single message. Do not use any other tools or do anything else. Do not send any other text or messages besides these tool calls.
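
Step 4 might expand to an invocation like the following (title and body are illustrative; the real values come from the diff context gathered above):

gh pr create --title "Short summary of the change" --body "Description generated from the staged diff"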

View File: dedupe.md

@@ -28,6 +28,11 @@ Found 3 possible duplicate issues:
 2. <link to issue>
 3. <link to issue>
-If your issue is a duplicate, please close it and 👍 the existing issue instead.
+This issue will be automatically closed as a duplicate in 3 days.
-## 🤖 Generated with [Claude Code](https://claude.ai/code)
+- If your issue is a duplicate, please close it and 👍 the existing issue instead
+- To prevent auto-closure, add a comment or 👎 this comment
+
+🤖 Generated with [Claude Code](https://claude.ai/code)
 ---

View File: init-firewall.sh

@@ -2,6 +2,9 @@
 set -euo pipefail  # Exit on error, undefined vars, and pipeline failures
 IFS=$'\n\t'        # Stricter word splitting
 
+# 1. Extract Docker DNS info BEFORE any flushing
+DOCKER_DNS_RULES=$(iptables-save -t nat | grep "127\.0\.0\.11" || true)
+
 # Flush existing rules and delete existing ipsets
 iptables -F
 iptables -X
@@ -11,6 +14,16 @@ iptables -t mangle -F
 iptables -t mangle -X
 ipset destroy allowed-domains 2>/dev/null || true
 
+# 2. Selectively restore ONLY internal Docker DNS resolution
+if [ -n "$DOCKER_DNS_RULES" ]; then
+    echo "Restoring Docker DNS rules..."
+    iptables -t nat -N DOCKER_OUTPUT 2>/dev/null || true
+    iptables -t nat -N DOCKER_POSTROUTING 2>/dev/null || true
+    echo "$DOCKER_DNS_RULES" | xargs -L 1 iptables -t nat
+else
+    echo "No Docker DNS rules to restore"
+fi
+
 # First allow DNS and localhost before any restrictions
 # Allow outbound DNS
 iptables -A OUTPUT -p udp --dport 53 -j ACCEPT
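
For context, each line captured by iptables-save begins with "-A <chain> ...", so replaying it through `xargs -L 1 iptables -t nat` re-creates the rule verbatim. A typical Docker embedded-DNS rule looks like this (the destination port is illustrative; Docker assigns it at runtime):

-A DOCKER_OUTPUT -d 127.0.0.11/32 -p udp -m udp --dport 53 -j DNAT --to-destination 127.0.0.11:54321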

View File: .github/workflows/auto-close-duplicates.yml

@@ -0,0 +1,30 @@
name: Auto-close duplicate issues (DRY RUN)
description: Dry run - logs issues that would be auto-closed as duplicates after 3 days if no response

on:
  schedule:
    - cron: "0 9 * * *"
  workflow_dispatch:

jobs:
  auto-close-duplicates:
    runs-on: ubuntu-latest
    timeout-minutes: 10
    permissions:
      contents: read
      issues: read
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
      - name: Setup Bun
        uses: oven-sh/setup-bun@v2
        with:
          bun-version: latest
      - name: Auto-close duplicate issues
        run: bun run scripts/auto-close-duplicates.ts
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          GITHUB_REPOSITORY_OWNER: ${{ github.repository_owner }}
          GITHUB_REPOSITORY_NAME: ${{ github.event.repository.name }}
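
Because the workflow declares workflow_dispatch alongside the daily cron, it can presumably also be run on demand with the GitHub CLI:

gh workflow run auto-close-duplicates.yml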

View File: .github/workflows/claude-dedupe-issues.yml

@@ -5,7 +5,7 @@ on:
   types: [opened]
 jobs:
-  triage-issue:
+  claude-dedupe-issues:
     runs-on: ubuntu-latest
     timeout-minutes: 10
     permissions:

View File: CHANGELOG.md

@@ -1,5 +1,47 @@
# Changelog

## 1.0.71

- Background commands: (Ctrl-b) to run any Bash command in the background so Claude can keep working (great for dev servers, tailing logs, etc.)
- Customizable status line: add your terminal prompt to Claude Code with /statusline

## 1.0.70

- Performance: Optimized message rendering for better performance with large contexts
- Windows: Fixed native file search, ripgrep, and subagent functionality
- Added support for @-mentions in slash command arguments

## 1.0.69

- Upgraded Opus to version 4.1

## 1.0.68

- Fix incorrect model names being used for certain commands like `/pr-comments`
- Windows: improve permissions checks for allow / deny tools and project trust. This may create a new project entry in `.claude.json` - manually merge the history field if desired.
- Windows: improve sub-process spawning to eliminate "No such file or directory" when running commands like pnpm
- Enhanced /doctor command with CLAUDE.md and MCP tool context for self-serve debugging
- SDK: Added canUseTool callback support for tool confirmation
- Added `disableAllHooks` setting
- Improved file suggestions performance in large repos

## 1.0.65

- IDE: Fixed connection stability issues and error handling for diagnostics
- Windows: Fixed shell environment setup for users without .bashrc files

## 1.0.64

- Agents: Added model customization support - you can now specify which model an agent should use
- Agents: Fixed unintended access to the recursive agent tool
- Hooks: Added systemMessage field to hook JSON output for displaying warnings and context
- SDK: Fixed user input tracking across multi-turn conversations
- Added hidden files to file search and @-mention suggestions

## 1.0.63

- Windows: Fixed file search, @agent mentions, and custom slash commands functionality

## 1.0.62

- Added @-mention support with typeahead for custom agents. @<your-custom-agent> to invoke it

View File: scripts/auto-close-duplicates.ts

@@ -0,0 +1,263 @@
#!/usr/bin/env bun

declare global {
  var process: {
    env: Record<string, string | undefined>;
  };
}

interface GitHubIssue {
  number: number;
  title: string;
  user: { id: number };
  created_at: string;
}

interface GitHubComment {
  id: number;
  body: string;
  created_at: string;
  user: { type: string; id: number };
}

interface GitHubReaction {
  user: { id: number };
  content: string;
}

async function githubRequest<T>(endpoint: string, token: string, method: string = 'GET', body?: any): Promise<T> {
  const response = await fetch(`https://api.github.com${endpoint}`, {
    method,
    headers: {
      Authorization: `Bearer ${token}`,
      Accept: "application/vnd.github.v3+json",
      "User-Agent": "auto-close-duplicates-script",
      ...(body && { "Content-Type": "application/json" }),
    },
    ...(body && { body: JSON.stringify(body) }),
  });

  if (!response.ok) {
    throw new Error(
      `GitHub API request failed: ${response.status} ${response.statusText}`
    );
  }

  return response.json();
}

function extractDuplicateIssueNumber(commentBody: string): number | null {
  const match = commentBody.match(/#(\d+)/);
  return match ? parseInt(match[1], 10) : null;
}
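
// A quick illustration (hypothetical comment body): the regex /#(\d+)/ matches the
// FIRST "#<number>" reference in the bot comment, i.e. the top-ranked entry in the
// "Found N possible duplicate issues" list:
//
//   extractDuplicateIssueNumber("Found 3 possible duplicate issues:\n1. #1234 - crash on startup")
//   // => 1234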
async function closeIssueAsDuplicate(
  owner: string,
  repo: string,
  issueNumber: number,
  duplicateOfNumber: number,
  token: string
): Promise<void> {
  await githubRequest(
    `/repos/${owner}/${repo}/issues/${issueNumber}`,
    token,
    'PATCH',
    {
      state: 'closed',
      state_reason: 'not_planned'
    }
  );

  await githubRequest(
    `/repos/${owner}/${repo}/issues/${issueNumber}/comments`,
    token,
    'POST',
    {
      body: `This issue has been automatically closed as a duplicate of #${duplicateOfNumber}.

If this is incorrect, please re-open this issue or create a new one.

🤖 Generated with [Claude Code](https://claude.ai/code)`
    }
  );
}
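
// Note: closing is two standard REST calls: PATCH /repos/{owner}/{repo}/issues/{number}
// setting state "closed" with state_reason "not_planned", then POST .../comments with
// the closure notice. The token presumably needs write access to issues for both.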
async function autoCloseDuplicates(): Promise<void> {
  console.log("[DEBUG] Starting auto-close duplicates script");

  const token = process.env.GITHUB_TOKEN;
  if (!token) {
    throw new Error("GITHUB_TOKEN environment variable is required");
  }
  console.log("[DEBUG] GitHub token found");

  const owner = process.env.GITHUB_REPOSITORY_OWNER || "anthropics";
  const repo = process.env.GITHUB_REPOSITORY_NAME || "claude-code";
  console.log(`[DEBUG] Repository: ${owner}/${repo}`);

  const threeDaysAgo = new Date();
  threeDaysAgo.setDate(threeDaysAgo.getDate() - 3);
  console.log(
    `[DEBUG] Checking for duplicate comments older than: ${threeDaysAgo.toISOString()}`
  );

  console.log("[DEBUG] Fetching open issues created more than 3 days ago...");
  const allIssues: GitHubIssue[] = [];
  let page = 1;
  const perPage = 100;

  while (true) {
    const pageIssues: GitHubIssue[] = await githubRequest(
      `/repos/${owner}/${repo}/issues?state=open&per_page=${perPage}&page=${page}`,
      token
    );
    if (pageIssues.length === 0) break;

    // Filter for issues created more than 3 days ago
    const oldEnoughIssues = pageIssues.filter(issue =>
      new Date(issue.created_at) <= threeDaysAgo
    );
    allIssues.push(...oldEnoughIssues);
    page++;

    // Safety limit to avoid infinite loops
    if (page > 20) break;
  }
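
  // With perPage = 100, the 20-page cap bounds each run to at most 2,000 open issues.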
  const issues = allIssues;
  console.log(`[DEBUG] Found ${issues.length} open issues`);

  let processedCount = 0;
  let candidateCount = 0;

  for (const issue of issues) {
    processedCount++;
    console.log(
      `[DEBUG] Processing issue #${issue.number} (${processedCount}/${issues.length}): ${issue.title}`
    );

    console.log(`[DEBUG] Fetching comments for issue #${issue.number}...`);
    const comments: GitHubComment[] = await githubRequest(
      `/repos/${owner}/${repo}/issues/${issue.number}/comments`,
      token
    );
    console.log(
      `[DEBUG] Issue #${issue.number} has ${comments.length} comments`
    );

    const dupeComments = comments.filter(
      (comment) =>
        comment.body.includes("Found") &&
        comment.body.includes("possible duplicate") &&
        comment.user.type === "Bot"
    );
    console.log(
      `[DEBUG] Issue #${issue.number} has ${dupeComments.length} duplicate detection comments`
    );

    if (dupeComments.length === 0) {
      console.log(
        `[DEBUG] Issue #${issue.number} - no duplicate comments found, skipping`
      );
      continue;
    }

    const lastDupeComment = dupeComments[dupeComments.length - 1];
    const dupeCommentDate = new Date(lastDupeComment.created_at);
    console.log(
      `[DEBUG] Issue #${
        issue.number
      } - most recent duplicate comment from: ${dupeCommentDate.toISOString()}`
    );

    if (dupeCommentDate > threeDaysAgo) {
      console.log(
        `[DEBUG] Issue #${issue.number} - duplicate comment is too recent, skipping`
      );
      continue;
    }

    console.log(
      `[DEBUG] Issue #${
        issue.number
      } - duplicate comment is old enough (${Math.floor(
        (Date.now() - dupeCommentDate.getTime()) / (1000 * 60 * 60 * 24)
      )} days)`
    );

    const commentsAfterDupe = comments.filter(
      (comment) => new Date(comment.created_at) > dupeCommentDate
    );
    console.log(
      `[DEBUG] Issue #${issue.number} - ${commentsAfterDupe.length} comments after duplicate detection`
    );

    if (commentsAfterDupe.length > 0) {
      console.log(
        `[DEBUG] Issue #${issue.number} - has activity after duplicate comment, skipping`
      );
      continue;
    }

    console.log(
      `[DEBUG] Issue #${issue.number} - checking reactions on duplicate comment...`
    );
    const reactions: GitHubReaction[] = await githubRequest(
      `/repos/${owner}/${repo}/issues/comments/${lastDupeComment.id}/reactions`,
      token
    );
    console.log(
      `[DEBUG] Issue #${issue.number} - duplicate comment has ${reactions.length} reactions`
    );

    const authorThumbsDown = reactions.some(
      (reaction) =>
        reaction.user.id === issue.user.id && reaction.content === "-1"
    );
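
    // GitHub's reaction API uses fixed content strings ("+1", "-1", "laugh", "confused",
    // "heart", "hooray", "rocket", "eyes"); "-1" is the thumbs-down opt-out that the
    // dedupe comment template above tells issue authors to use.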
    console.log(
      `[DEBUG] Issue #${issue.number} - author thumbs down reaction: ${authorThumbsDown}`
    );

    if (authorThumbsDown) {
      console.log(
        `[DEBUG] Issue #${issue.number} - author disagreed with duplicate detection, skipping`
      );
      continue;
    }

    const duplicateIssueNumber = extractDuplicateIssueNumber(lastDupeComment.body);
    if (!duplicateIssueNumber) {
      console.log(
        `[DEBUG] Issue #${issue.number} - could not extract duplicate issue number from comment, skipping`
      );
      continue;
    }

    candidateCount++;
    const issueUrl = `https://github.com/${owner}/${repo}/issues/${issue.number}`;

    try {
      console.log(
        `[INFO] Auto-closing issue #${issue.number} as duplicate of #${duplicateIssueNumber}: ${issueUrl}`
      );
      await closeIssueAsDuplicate(owner, repo, issue.number, duplicateIssueNumber, token);
      console.log(
        `[SUCCESS] Successfully closed issue #${issue.number} as duplicate of #${duplicateIssueNumber}`
      );
    } catch (error) {
      console.error(
        `[ERROR] Failed to close issue #${issue.number} as duplicate: ${error}`
      );
    }
  }

  console.log(
    `[DEBUG] Script completed. Processed ${processedCount} issues, found ${candidateCount} candidates for auto-close`
  );
}

autoCloseDuplicates().catch(console.error);

// Make it a module
export {};
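
The script can presumably be run locally the same way the workflow runs it; owner and repo fall back to anthropics/claude-code when the env vars are unset, and the token needs write access to issues:

GITHUB_TOKEN=<token> bun run scripts/auto-close-duplicates.ts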

View File: scripts/backfill-duplicate-comments.ts

@@ -0,0 +1,198 @@
#!/usr/bin/env bun

declare global {
  var process: {
    env: Record<string, string | undefined>;
  };
}

interface GitHubIssue {
  number: number;
  title: string;
  state: string;
  state_reason?: string;
  user: { id: number };
  created_at: string;
  closed_at?: string;
}

interface GitHubComment {
  id: number;
  body: string;
  created_at: string;
  user: { type: string; id: number };
}

async function githubRequest<T>(endpoint: string, token: string, method: string = 'GET', body?: any): Promise<T> {
  const response = await fetch(`https://api.github.com${endpoint}`, {
    method,
    headers: {
      Authorization: `Bearer ${token}`,
      Accept: "application/vnd.github.v3+json",
      "User-Agent": "backfill-duplicate-comments-script",
      ...(body && { "Content-Type": "application/json" }),
    },
    ...(body && { body: JSON.stringify(body) }),
  });

  if (!response.ok) {
    throw new Error(
      `GitHub API request failed: ${response.status} ${response.statusText}`
    );
  }

  return response.json();
}

async function triggerDedupeWorkflow(
  owner: string,
  repo: string,
  issueNumber: number,
  token: string,
  dryRun: boolean = true
): Promise<void> {
  if (dryRun) {
    console.log(`[DRY RUN] Would trigger dedupe workflow for issue #${issueNumber}`);
    return;
  }

  await githubRequest(
    `/repos/${owner}/${repo}/actions/workflows/claude-dedupe-issues.yml/dispatches`,
    token,
    'POST',
    {
      ref: 'main',
      inputs: {
        issue_number: issueNumber.toString()
      }
    }
  );
}
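
// Note: this POST targets GitHub's standard workflow_dispatch endpoint,
// /repos/{owner}/{repo}/actions/workflows/{workflow-file}/dispatches. It assumes that
// claude-dedupe-issues.yml declares a workflow_dispatch trigger with an `issue_number`
// input; GitHub rejects the dispatch with a 422 if that input is not defined.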
async function backfillDuplicateComments(): Promise<void> {
  console.log("[DEBUG] Starting backfill duplicate comments script");

  const token = process.env.GITHUB_TOKEN;
  if (!token) {
    throw new Error(`GITHUB_TOKEN environment variable is required

Usage:
  GITHUB_TOKEN=your_token bun run scripts/backfill-duplicate-comments.ts

Environment Variables:
  GITHUB_TOKEN - GitHub personal access token with repo and actions permissions (required)
  DRY_RUN - Set to "false" to actually trigger workflows (default: true for safety)
  DAYS_BACK - How many days back to look for old issues (default: 90)`);
  }
  console.log("[DEBUG] GitHub token found");

  const owner = "anthropics";
  const repo = "claude-code";
  const dryRun = process.env.DRY_RUN !== "false";
  const daysBack = parseInt(process.env.DAYS_BACK || "90", 10);

  console.log(`[DEBUG] Repository: ${owner}/${repo}`);
  console.log(`[DEBUG] Dry run mode: ${dryRun}`);
  console.log(`[DEBUG] Looking back ${daysBack} days`);

  const cutoffDate = new Date();
  cutoffDate.setDate(cutoffDate.getDate() - daysBack);
  console.log(`[DEBUG] Fetching issues created since ${cutoffDate.toISOString()}...`);

  const allIssues: GitHubIssue[] = [];
  let page = 1;
  const perPage = 100;

  while (true) {
    const pageIssues: GitHubIssue[] = await githubRequest(
      `/repos/${owner}/${repo}/issues?state=all&per_page=${perPage}&page=${page}&since=${cutoffDate.toISOString()}`,
      token
    );
    if (pageIssues.length === 0) break;

    allIssues.push(...pageIssues);
    page++;

    // Safety limit to avoid infinite loops
    if (page > 100) {
      console.log("[DEBUG] Reached page limit, stopping pagination");
      break;
    }
  }
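
  // With perPage = 100, the 100-page cap bounds the backfill at 10,000 issues.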
  console.log(`[DEBUG] Found ${allIssues.length} issues from the last ${daysBack} days`);

  let processedCount = 0;
  let candidateCount = 0;
  let triggeredCount = 0;

  for (const issue of allIssues) {
    processedCount++;
    console.log(
      `[DEBUG] Processing issue #${issue.number} (${processedCount}/${allIssues.length}): ${issue.title}`
    );

    console.log(`[DEBUG] Fetching comments for issue #${issue.number}...`);
    const comments: GitHubComment[] = await githubRequest(
      `/repos/${owner}/${repo}/issues/${issue.number}/comments`,
      token
    );
    console.log(
      `[DEBUG] Issue #${issue.number} has ${comments.length} comments`
    );

    // Look for existing duplicate detection comments (from the dedupe bot)
    const dupeDetectionComments = comments.filter(
      (comment) =>
        comment.body.includes("Found") &&
        comment.body.includes("possible duplicate") &&
        comment.user.type === "Bot"
    );
    console.log(
      `[DEBUG] Issue #${issue.number} has ${dupeDetectionComments.length} duplicate detection comments`
    );

    // Skip if there's already a duplicate detection comment
    if (dupeDetectionComments.length > 0) {
      console.log(
        `[DEBUG] Issue #${issue.number} already has duplicate detection comment, skipping`
      );
      continue;
    }

    candidateCount++;
    const issueUrl = `https://github.com/${owner}/${repo}/issues/${issue.number}`;

    try {
      console.log(
        `[INFO] ${dryRun ? '[DRY RUN] ' : ''}Triggering dedupe workflow for issue #${issue.number}: ${issueUrl}`
      );
      await triggerDedupeWorkflow(owner, repo, issue.number, token, dryRun);
      if (!dryRun) {
        console.log(
          `[SUCCESS] Successfully triggered dedupe workflow for issue #${issue.number}`
        );
      }
      triggeredCount++;
    } catch (error) {
      console.error(
        `[ERROR] Failed to trigger workflow for issue #${issue.number}: ${error}`
      );
    }

    // Add a delay between workflow triggers to avoid overwhelming the system
    await new Promise(resolve => setTimeout(resolve, 1000));
  }

  console.log(
    `[DEBUG] Script completed. Processed ${processedCount} issues, found ${candidateCount} candidates without duplicate comments, ${dryRun ? 'would trigger' : 'triggered'} ${triggeredCount} workflows`
  );
}

backfillDuplicateComments().catch(console.error);

// Make it a module
export {};