Best ChatGPT prompts for automating repetitive coding tasks

12 practical, copy-ready prompts to automate repetitive coding tasks: scripts, CI/CD, codemods, migrations, server provisioning, and repo maintenance. Each entry includes a clear title, a concise explanation, a copyable prompt, a realistic example, and suggested AIs that work best.

Suggested models: GPT-5, Claude Opus 4, Claude Sonnet 4, Gemini 2.5 Flash, Gemini 2.5 Pro

You know that sinking feeling when you're staring at the same repetitive coding task for the third time this month, thinking "there has to be a better way to automate this." You fire up ChatGPT with a vague prompt like "write a script to clean up my files" and get back generic code that breaks the moment you try to use it in your real environment. What if you had battle-tested prompts that actually worked the first time?
This collection gives you 12 copy-ready prompts that generate production-quality automation scripts for the tasks that eat up your time every week. From batch file operations and database migrations to CI/CD workflows and server provisioning, each prompt is designed to produce robust, real-world code with error handling, logging, and safety features built in. Instead of spending hours debugging generic AI output, you'll have working automation tools that save you time and prevent the mistakes that come with repetitive manual work.

1. Batch-rename files with regex (Python)

Generate a safe, idempotent Python script that previews and then batch-renames files in a directory using a regular expression. Includes dry-run, logging, and undo support (mapping file).

Prompt:
Write a complete Python 3 script named batch_rename.py that:
- Accepts CLI args: --dir (directory), --pattern (regex), --replacement (replacement string using re.sub syntax), --dry-run (flag), --map-file (path to write JSON map of old->new names), --ignore-case (flag).
- Recursively finds files in --dir, computes new names using re.sub(pattern, replacement), and shows a preview table.
- On confirmation, performs atomic renames, writes the mapping JSON to --map-file, and supports an --undo flag that reads the map and restores original names.
- Logs all operations and handles name collisions safely (skip and log collision).
- Include usage examples and unit tests for the core rename function.
Produce the full script code and a short README section explaining usage and safety notes.

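To give a sense of the expected output, here is a stripped-down sketch of the core rename loop, covering only --dir, --pattern, --replacement, and --dry-run; the mapping file, undo, --ignore-case, logging, and tests from the full prompt are omitted.

```python
#!/usr/bin/env python3
"""Minimal sketch: preview and apply regex-based renames.
Only --dir/--pattern/--replacement/--dry-run are handled here."""
import argparse
import re
from pathlib import Path


def plan_renames(root: Path, pattern: str, replacement: str):
    """Yield (old_path, new_path) pairs for files whose names would change."""
    regex = re.compile(pattern)
    for path in root.rglob("*"):
        if not path.is_file():
            continue
        new_name = regex.sub(replacement, path.name)
        if new_name != path.name:
            yield path, path.with_name(new_name)


def main():
    parser = argparse.ArgumentParser(description="Batch-rename files with a regex.")
    parser.add_argument("--dir", required=True, type=Path)
    parser.add_argument("--pattern", required=True)
    parser.add_argument("--replacement", required=True)
    parser.add_argument("--dry-run", action="store_true")
    args = parser.parse_args()

    # Materialise the plan first so renames never disturb directory traversal.
    for old, new in list(plan_renames(args.dir, args.pattern, args.replacement)):
        if new.exists():
            print(f"SKIP (collision): {old} -> {new}")
            continue
        print(f"{old} -> {new}")
        if not args.dry_run:
            old.rename(new)


if __name__ == "__main__":
    main()
```
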
2. CSV-to-JSON S3 uploader CLI (Node.js)

Create a Node.js CLI tool that converts CSV files to JSON objects, validates schema, batches uploads to S3, and supports parallel processing and retry logic.

Prompt:
Create a Node.js (ESM) CLI project scaffold named csv2s3 with a bin script csv2s3.js that:
- Accepts args: --input <glob>, --bucket <s3-bucket>, --prefix <s3-prefix>, --schema <json-schema-file>, --concurrency <n>, --dry-run.
- Uses fast-csv or csv-parse to convert CSV rows to JSON, validates against the provided JSON Schema (ajv), and uploads files in batches to S3 using AWS SDK v3 with configurable concurrency and exponential backoff.
- Produces detailed summary stats (rows processed, failures, uploaded objects, bytes) and writes a log file.
- Includes package.json, example usage, and unit tests for parsing and validation.
Return full code for csv2s3.js, example package.json, and README usage examples.

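As a rough sketch of the pipeline the prompt describes, assuming a single input file and none of the globbing, concurrency, backoff, or stats reporting: parse with csv-parse, validate with ajv, upload with AWS SDK v3.

```js
// Minimal ESM sketch of the csv2s3 core: parse, validate, upload one file.
import { readFile } from "node:fs/promises";
import { parse } from "csv-parse/sync";
import Ajv from "ajv";
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const [inputPath, bucket, prefix, schemaPath] = process.argv.slice(2);

const schema = JSON.parse(await readFile(schemaPath, "utf8"));
const validate = new Ajv().compile(schema);

// columns: true turns each CSV row into an object keyed by header names.
const rows = parse(await readFile(inputPath, "utf8"), { columns: true });
const bad = rows.filter((row) => !validate(row));
if (bad.length > 0) {
  console.error(`Validation failed for ${bad.length} row(s)`);
  process.exit(1);
}

const s3 = new S3Client({});
const key = `${prefix}/${inputPath.replace(/\.csv$/, ".json")}`;
await s3.send(
  new PutObjectCommand({ Bucket: bucket, Key: key, Body: JSON.stringify(rows) })
);
console.log(`Uploaded ${rows.length} rows to s3://${bucket}/${key}`);
```
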
3. GitHub Actions workflow for test/build/deploy

Generate a reusable GitHub Actions workflow that runs tests, builds artifacts, and deploys to a staging server on push to main, with matrix testing and secrets handling.

Prompt:
Write a GitHub Actions workflow (.github/workflows/ci-cd.yml) that:
- Triggers on push to main and on pull_request to main.
- Runs a matrix job for node versions [16,18] and OS [ubuntu-latest, macos-latest].
- Steps: checkout, cache node modules, install, run linter, run tests with coverage, build artifact, upload artifact to workflow, and on push to main deploy to a staging server via SSH (use secrets: STAGING_HOST, STAGING_USER, STAGING_SSH_KEY).
- Includes conditional steps: only deploy on push to main, only upload coverage when tests succeed, and uses concurrency group to avoid parallel deploys.
Provide the full YAML and a short explanation of required repo secrets and how to adapt for other infra (e.g., S3 or Docker image push).

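A trimmed sketch of the workflow shape the prompt asks for (matrix tests plus a deploy job gated to pushes on main); caching, linting, coverage and artifact upload, the real SSH deploy step, and the concurrency group are left to the full prompt.

```yaml
name: ci-cd
on:
  push:
    branches: [main]
  pull_request:
    branches: [main]
jobs:
  test:
    strategy:
      matrix:
        node: [16, 18]
        os: [ubuntu-latest, macos-latest]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: ${{ matrix.node }}
      - run: npm ci
      - run: npm test
  deploy:
    needs: test
    if: github.event_name == 'push' && github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    steps:
      # Placeholder: the full workflow deploys over SSH using the STAGING_* secrets.
      - run: echo "deploy to staging"
```
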
4. Postgres migration: normalize column names across tables

Produce a safe Postgres migration script and a verifying script that renames columns to a canonical naming scheme across multiple tables, with rollback and tests.

Prompt:
Generate a SQL migration (up and down) plus a Python verification script that:
- Targets Postgres and renames columns whose names match a provided mapping, e.g., {"usr_id": "user_id", "createdAt": "created_at"}.
- Preserves constraints, indexes, foreign keys by generating appropriate ALTER TABLE statements and recreating dependent objects if necessary.
- Includes BEGIN/COMMIT, transactional safety, and a rollback (down) migration that restores old names.
- Provides a Python script verify_schema.py that connects to the DB, verifies all target tables have the new names, and optionally checks row counts remained unchanged.
- Include guidance on running in production with minimal locking (use pg_repack recommendations if appropriate).
Return the SQL migration, verify_schema.py, and short run instructions.

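As a minimal sketch, the up and down migrations for the example mapping might look like the following; the table name orders is purely illustrative, and verify_schema.py, dependent-object handling, and locking guidance are left to the full prompt.

```sql
-- Up migration (sketch): apply the example mapping to one hypothetical table.
BEGIN;
ALTER TABLE orders RENAME COLUMN usr_id TO user_id;
ALTER TABLE orders RENAME COLUMN "createdAt" TO created_at;
COMMIT;

-- Down migration (sketch): restore the original names.
BEGIN;
ALTER TABLE orders RENAME COLUMN user_id TO usr_id;
ALTER TABLE orders RENAME COLUMN created_at TO "createdAt";
COMMIT;
```
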
5. Codemod to replace deprecated API calls (jscodeshift)

Produce a jscodeshift transform that replaces deprecated function calls and updates imports across a JavaScript/TypeScript repo, handling named vs default imports and preserving comments.

Prompt:
Write a jscodeshift transform script replace_api.js that:
- Accepts a mapping from deprecated identifiers to new identifiers and optionally a new module path, e.g., {"oldApi": {"newName": "newApi", "newModule": "@new/lib"}}.
- Finds occurrences of oldApi as named imports, default imports, and property access (obj.oldApi), replaces them with the new API, and updates import sources when newModule is provided.
- Preserves comments and formatting, and emits a summary of changed files and nodes.
- Include instructions on running the transform: npx jscodeshift -t replace_api.js src --parser=tsx and examples of mappings.
Provide the full transform code and two concrete mapping examples.

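A minimal transform sketch using the standard jscodeshift (fileInfo, api) signature with the mapping hard-coded; it renames identifiers and repoints named imports, but default imports, property-access-only handling, and the change summary from the full prompt are not covered.

```js
// Minimal jscodeshift sketch; run with: npx jscodeshift -t replace_api.js src
const MAPPING = { oldApi: { newName: "newApi", newModule: "@new/lib" } };

module.exports = function transform(fileInfo, api) {
  const j = api.jscodeshift;
  const root = j(fileInfo.source);

  Object.entries(MAPPING).forEach(([oldName, { newName, newModule }]) => {
    // Crude rename: every identifier with the old name, wherever it appears.
    root.find(j.Identifier, { name: oldName }).forEach((path) => {
      path.node.name = newName;
    });

    // Repoint named imports of the (now renamed) identifier at the new module.
    if (newModule) {
      root
        .find(j.ImportDeclaration)
        .filter((path) =>
          (path.node.specifiers || []).some(
            (s) => s.type === "ImportSpecifier" && s.imported.name === newName
          )
        )
        .forEach((path) => {
          path.node.source = j.literal(newModule);
        });
    }
  });

  return root.toSource();
};
```
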
6. Compress & archive logs older than N days (Bash + rsync)

Create a robust Bash script to find log files older than N days, compress them into dated archives, rotate archives, and rsync to remote storage with retries.

Prompt:
Produce a Bash script rotate_logs.sh that:
- Accepts args: --dir, --days, --archive-dir, --remote (user@host:/path), --compress (gz|xz), --dry-run.
- Finds files in --dir older than --days, groups them by date, compresses each group into a timestamped archive in --archive-dir, enforces a max number of archives to keep, and then rsyncs new archives to --remote with retry/backoff.
- Writes a manifest JSON with archive contents and size, supports verbose and quiet modes, and exits non-zero on unrecoverable errors.
- Handles filenames with spaces/newlines and uses flock or lockfile for single-run safety.
Return the full script with usage examples and a small systemd timer example to run weekly.

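A minimal sketch of the core pipeline, assuming GNU find/tar and positional arguments; the flag parsing, date grouping, retention limit, manifest, locking, and retry/backoff from the full prompt are omitted.

```bash
#!/usr/bin/env bash
# Minimal sketch: compress files older than N days into one dated archive
# and rsync it to a remote host.
set -euo pipefail

DIR=$1            # directory containing logs
DAYS=$2           # age threshold in days
ARCHIVE_DIR=$3    # where to write the archive
REMOTE=$4         # e.g. user@host:/backups/logs

archive="$ARCHIVE_DIR/logs-$(date +%Y%m%d).tar.gz"
mkdir -p "$ARCHIVE_DIR"

# -print0 / --null keep filenames with spaces or newlines intact.
find "$DIR" -type f -mtime +"$DAYS" -print0 |
  tar --null --files-from=- -czf "$archive"

rsync -av "$archive" "$REMOTE"
```
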
7. Auto-generate pytest unit tests for a module (Python)

Create a script that inspects a Python module and scaffolds pytest unit tests (happy path, edge cases, type checks) using simple stubs and Hypothesis property-based tests where applicable.

Prompt:
Write a Python script gen_tests.py that:
- Accepts CLI args: --module (dotted import path), --output-dir, --max-examples (for Hypothesis), and --include-hypothesis (flag).
- Uses inspect to discover functions and methods in the module, extracts type hints and docstrings, and generates pytest test files with parameterized test cases and Hypothesis strategies when type hints are available.
- Includes safe placeholders for external dependencies (use monkeypatch examples), and marks generated tests with TODO comments where human review is required.
- Provide example generated tests for a sample module utilities.py with 4 functions.
Return the generator script and the sample generated tests.

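A minimal sketch of the generator: it imports the module, finds public functions with inspect, and writes TODO-marked pytest stubs; Hypothesis strategies, type-hint-driven parameters, and monkeypatch placeholders are left to the full prompt.

```python
"""Minimal sketch of gen_tests.py: write TODO-marked pytest stubs for every
public function in a module."""
import argparse
import importlib
import inspect
from pathlib import Path

STUB = '''def test_{name}():
    # TODO: fill in realistic arguments and expected result (human review needed)
    result = {module}.{name}()
    assert result is not None

'''


def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("--module", required=True, help="dotted import path")
    parser.add_argument("--output-dir", default=".", type=Path)
    args = parser.parse_args()

    module = importlib.import_module(args.module)
    functions = [
        name
        for name, obj in inspect.getmembers(module, inspect.isfunction)
        if obj.__module__ == module.__name__ and not name.startswith("_")
    ]

    header = f"import {args.module}\n\n\n"
    body = "".join(STUB.format(name=name, module=args.module) for name in functions)

    out_file = args.output_dir / f"test_{args.module.split('.')[-1]}.py"
    out_file.write_text(header + body)
    print(f"Wrote {len(functions)} test stub(s) to {out_file}")


if __name__ == "__main__":
    main()
```
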