Ryan Scott Brown

I build cloud-based systems for startups and enterprises. My background in operations gives me a unique focus on writing observable, reliable software and automating maintenance work.

I love learning and teaching about Amazon Web Services, automation tools such as Ansible, and the serverless ecosystem. I most often write code in Python, TypeScript, and Rust.

B.S. in Applied Networking and Systems Administration with a minor in Software Engineering from Rochester Institute of Technology.

    TIL: Prompts Generating Prompts

    Using ChatGPT, I've started adding a tail onto my prompts to generate better prompts. For example:

    How might I set up custom metrics for my applications in AWS? Compare using CloudWatch Embedded Metric Format (EMF), calling the CloudWatch API from my app, and using the CloudWatch agent.

    Also, how could I have asked this better to get a clearer answer?

    This gets a bit more out of the model, and I've found that feeding the suggested rephrasing back in as the next prompt gives good results. Sometimes the model misses the suffix instruction entirely, and I haven't found a pattern for when that happens.
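    A minimal sketch of how this could be scripted, assuming the official openai Python client and a placeholder model name (both are my assumptions, not part of the original workflow):

    ```python
    from openai import OpenAI

    # Tail appended to every question to ask the model for a better prompt.
    SUFFIX = "\n\nAlso, how could I have asked this better to get a clearer answer?"

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


    def ask(prompt: str, model: str = "gpt-4o-mini") -> str:
        """Send a prompt with the prompt-improving tail appended."""
        response = client.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": prompt + SUFFIX}],
        )
        return response.choices[0].message.content


    first_answer = ask(
        "How might I set up custom metrics for my applications in AWS? "
        "Compare using CloudWatch Embedded Metric Format (EMF), calling the "
        "CloudWatch API from my app, and using the CloudWatch agent."
    )

    # The reply usually ends with a rewritten question; feeding that back in
    # as a fresh prompt is the part that gets more out of the model.
    print(ask(first_answer))
    ```

    Keeping the suffix as a single constant also makes it easy to try different phrasings when the model ignores it.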

    I tried this, but it didn’t work:

    How might I set up custom metrics for my applications in AWS?

    Come up with a better way to phrase this question, then answer that instead.

    The model seems to just answer the original question.
