Kevin Naidoo

DeepSeek is cool?

Have you tried DeepSeek yet (https://chat.deepseek.com/)?

I have been playing around with DeepSeek recently, both the chatbot and the API. It's compatible with the OpenAI SDKs, so it's usually a drop-in replacement.
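As a rough sketch of what I mean (the endpoint and model name below are taken from DeepSeek's API docs, so double-check them against the current docs), you can hit the OpenAI-style chat completions endpoint directly with curl:

# Minimal sketch, assuming DEEPSEEK_API_KEY is set in your environment.
# DeepSeek's endpoint mirrors OpenAI's /chat/completions shape, which is why
# existing OpenAI clients only need the base URL and API key swapped.
curl https://api.deepseek.com/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $DEEPSEEK_API_KEY" \
  -d '{
    "model": "deepseek-chat",
    "messages": [{"role": "user", "content": "Write a one-line Bash hello world"}]
  }'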

It seems fairly capable and fast; I have used it to generate HTML, Bash, and even some Go code.

What's interesting at this stage is the "DeepThink" feature, which functions like an agent and is pretty useful for coding tasks. I tried generating a Bash script from scratch, and it did a pretty decent job.

The prompt

Generate me a Postgres backup script that takes a base backup 
and uploads to S3.

The result

#!/bin/bash

# Configuration variables
PG_USER="postgres"
PG_HOST="localhost"
PG_PORT="5432"
S3_BUCKET="your-s3-bucket"
BASE_BACKUP_DIR="/path/to/base_backup"
WAL_ARCHIVE_DIR="/path/to/wal_archive"
LOG_FILE="/var/log/postgres_backup.log"

# Function to log messages
log() {
    echo "$(date '+%Y-%m-%d %H:%M:%S') - $1" >> $LOG_FILE
}

# Take a base backup
take_base_backup() {
    log "Starting base backup..."
    pg_basebackup \
        -U $PG_USER \
        -h $PG_HOST \
        -p $PG_PORT \
        -D $BASE_BACKUP_DIR \
        -F t \
        -z \
        -R \
        -w
    if [ $? -eq 0 ]; then
        log "Base backup completed successfully."
    else
        log "Base backup failed."
        exit 1
    fi
}

# Upload base backup to S3
upload_base_backup() {
    log "Uploading base backup to S3..."
    AWS_CONFIG_FILE=~/.aws/config aws s3 cp $BASE_BACKUP_DIR s3://$S3_BUCKET/base_backup/ --recursive --acl private
    if [ $? -eq 0 ]; then
        log "Base backup uploaded successfully."
    else
        log "Failed to upload base backup to S3."
        exit 1
    fi
}

# Configure PostgreSQL for WAL archiving
configure_wal_archiving() {
    log "Configuring PostgreSQL for WAL archiving..."
    # Edit postgresql.conf
    echo "archive_mode = on" >> /path/to/postgresql.conf
    echo "archive_command = '/path/to/archive_wal.sh %p'" >> /path/to/postgresql.conf
    # Restart PostgreSQL service
    sudo systemctl restart postgresql
    log "WAL archiving configured."
}

# Write out the WAL archiving script referenced by archive_command
# (saved to /path/to/archive_wal.sh and made executable)
create_wal_archive_script() {
    cat > /path/to/archive_wal.sh <<'EOF'
#!/bin/bash
# archive_wal.sh - called by PostgreSQL with the path of each completed WAL file
S3_BUCKET="your-s3-bucket"
WAL_FILE=$1
AWS_CONFIG_FILE=~/.aws/config aws s3 cp "$WAL_FILE" "s3://$S3_BUCKET/wal_archive/$(basename "$WAL_FILE")"
if [ $? -eq 0 ]; then
    echo "WAL file $WAL_FILE uploaded to S3 successfully."
else
    echo "Failed to upload WAL file $WAL_FILE to S3."
    exit 1
fi
EOF
    chmod +x /path/to/archive_wal.sh
}

# Main script execution
take_base_backup
upload_base_backup
create_wal_archive_script
configure_wal_archiving

log "Backup process completed."
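If you end up running something like this unattended, a cron entry is the obvious next step (the script path here is just a placeholder):

# Hypothetical cron entry: take the backup nightly at 02:00
0 2 * * * /path/to/postgres_backup.sh >> /var/log/postgres_backup_cron.log 2>&1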

I optimized the script and made a few tweaks, but overall it saved me a good 30 minutes at the very least.
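To give a flavour of the kind of tweaks I mean, here is a generic hardening sketch (not my exact changes): fail on unset variables, timestamp the backup directory, and quote expansions:

# Generic hardening sketch, not the exact tweaks I made
set -u                                             # treat unset variables as errors

# Timestamped backup directory so repeated runs don't clobber each other
BASE_BACKUP_DIR="/path/to/base_backup/$(date +%Y%m%d_%H%M%S)"
mkdir -p "$BASE_BACKUP_DIR"

# Quote expansions so paths with spaces don't break the upload
aws s3 cp "$BASE_BACKUP_DIR" "s3://$S3_BUCKET/base_backup/" --recursive --acl private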

Claude Sonnet is still the best for content generation but costs way more, so for general coding tasks this is decent enough.
