Build Your Own AI “Politeness Converter” for Pennies with Amazon Bedrock

We have all been there. You receive an email that makes your blood boil. You type out a furious reply. You hover over the “Send” button… and then you realize you enjoy being employed.

Usually, you would have to delete it and rewrite it manually. But today, we are going to build a serverless API that takes your angry rant and converts it into “Professional Corporate Speak” using Amazon Bedrock.

The best part? We are going to use the Claude 3 Haiku model, which is so cost-effective that you could convert thousands of angry emails for less than the price of a gumball.

The Architecture: Simple & Serverless

We don’t need a massive server. We just need a tiny function that acts as a bridge to the brain of the AI.

  • Compute: AWS Lambda (Python 3.12)
  • AI Model: Amazon Bedrock (Claude 3 Haiku)
  • Trigger: Lambda Function URL (for a quick HTTP endpoint)

Step 1: Accessing the Brain (Bedrock)

First, head to the AWS Console and search for Amazon Bedrock.

  1. Go to Model Access in the sidebar.
  2. Request access to Anthropic -> Claude 3 Haiku. (It usually grants access instantly).
  3. Note the Model ID: anthropic.claude-3-haiku-20240307-v1:0.

Why Haiku? It is blazing fast and incredibly cheap. It costs roughly $0.00025 per 1,000 input tokens. You can barely measure the cost of this experiment.
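To put a number on that claim, here is a quick back-of-envelope sketch in Python. The prices are the published Haiku rates at the time of writing ($0.25 per million input tokens, $1.25 per million output tokens — check the current Bedrock pricing page), and the 200-token email size is an assumption:

```python
# Claude 3 Haiku pricing (assumed; verify against current Bedrock pricing)
input_cost_per_1k = 0.00025    # USD per 1,000 input tokens
output_cost_per_1k = 0.00125   # USD per 1,000 output tokens

# Assume a ~200-token angry email in and a ~200-token polite reply out
tokens_in, tokens_out = 200, 200

cost_per_email = (tokens_in / 1000) * input_cost_per_1k \
               + (tokens_out / 1000) * output_cost_per_1k

print(f"Cost per email: ${cost_per_email:.6f}")          # $0.000300
print(f"Emails per dollar: {1 / cost_per_email:,.0f}")   # 3,333
```

Roughly 3,300 converted emails per dollar — gumball economics indeed.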

Step 2: The Permissions (IAM)

Your Lambda function needs permission to talk to Bedrock. Create an IAM role for the function (its execution role) with a policy that allows the bedrock:InvokeModel action.

Pro Tip for Community Builders: in production you should scope the policy's Resource down to the specific model ARN. For a sandbox demo, though, step one is simply making sure your Lambda can reach the model at all.
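A minimal policy sketch is below. The Resource is the Haiku foundation-model ARN in us-east-1 (foundation-model ARNs have no account ID); attach the AWSLambdaBasicExecutionRole managed policy alongside it so the function can also write CloudWatch logs:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "bedrock:InvokeModel",
      "Resource": "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-haiku-20240307-v1:0"
    }
  ]
}
```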

Step 3: The Code

Create a Lambda function and paste this in. We are using boto3 to send our text to Bedrock.

Python

import json
import boto3

# Create the client
bedrock = boto3.client(service_name='bedrock-runtime', region_name='us-east-1')

def lambda_handler(event, context):
    # 1. Parse the input (your angry email)
    body = json.loads(event.get('body') or '{}')  # tolerate a missing/empty body
    angry_text = body.get('text', 'I am annoyed.')

    # 2. Prepare the prompt for the AI
    prompt = f"Rewrite the following text to be professional, polite, and corporate-friendly:\n\n{angry_text}"

    # 3. Construct the payload for Claude 3
    payload = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 1000,
        "messages": [
            {
                "role": "user",
                "content": [{"type": "text", "text": prompt}]
            }
        ]
    }

    # 4. Call Bedrock
    response = bedrock.invoke_model(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",
        contentType="application/json",
        accept="application/json",
        body=json.dumps(payload)
    )

    # 5. Parse the result
    result = json.loads(response['body'].read())
    polite_text = result['content'][0]['text']

    return {
        'statusCode': 200,
        'headers': {'Content-Type': 'application/json'},
        'body': json.dumps({'original': angry_text, 'polite_version': polite_text})
    }
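Before wiring up the Function URL, you can sanity-check the response-parsing logic (step 5 above) locally with no AWS credentials. This sketch fakes the envelope that invoke_model returns for Claude 3 models — a streaming body whose JSON carries a content list of text blocks; the polite reply text here is invented:

```python
import json
from io import BytesIO

# Fake the invoke_model response: 'body' is a readable stream whose
# JSON payload contains a 'content' list of text blocks (Claude 3 shape).
fake_response = {
    "body": BytesIO(json.dumps({
        "content": [{"type": "text", "text": "Thank you for your patience."}]
    }).encode("utf-8"))
}

# Identical parsing to step 5 of the handler
result = json.loads(fake_response["body"].read())
polite_text = result["content"][0]["text"]
print(polite_text)  # Thank you for your patience.
```

If this parses cleanly, any error you hit after deployment is in permissions or the request, not in your parsing code.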

Step 4: The Test

I sent a curl request to the Function URL with the text:

“This code is garbage and nothing works. Fix it now.”

The AI responded:

“I have reviewed the current codebase and identified some areas for improvement. I would appreciate it if we could address these issues promptly to ensure functionality.”

Why This is a “Community Builder” Level Project

Anyone can use ChatGPT in a browser. But building an integration using the SDK shows you understand how to build applications on top of AI, not just use them as a chatbot.

Plus, by choosing Claude 3 Haiku over the larger models, you demonstrate Cost Optimization—a pillar of the AWS Well-Architected Framework. You aren’t just burning credits; you are engineering a solution.

So go ahead. Build the tool. Save your career. And welcome to the world of Generative AI on AWS.
