Photo by Jéan Béller on Unsplash

Creating MacOS-Agent Part 1: Use LLM to Automate Your MacOS

Sarin Suriyakoon
3 min read · Jan 26, 2024


I have been looking for a way to automate my macOS that is easier than remembering and writing the piles of bash scripts and commands I google every other day.

One day the idea came to me: what if I put an LLM (such as ChatGPT, Claude, or Bard) and a bash script together?

In this first part of the series, we will explore the barebones version of this bash script, which is enough to handle simple automation tasks.

If you are new to bash scripting, please check out Getting Started with Bash Script first.

Raw Idea

The components of this idea are simple.

  1. Use AppleScript, or in practical terms the osascript command.
  2. Use an LLM to generate the correct command from the user's natural-language input, such as "Open Notes" or "Write this file".

Voilà! We can now automate macOS.
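Both pieces can be sanity-checked from a terminal before wiring them together; osascript runs AppleScript one-liners directly (the Notes example below is just an illustration):

```shell
# Run a single AppleScript statement with -e:
osascript -e 'tell application "Notes" to activate'

# osascript also returns values to the shell, handy for quick tests:
osascript -e 'return 2 + 3'   # prints 5
```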

LLM of Choice

I am using Claude through AWS Bedrock in this example, but you can use others such as ChatGPT and Bard. The most effective prompt may differ a little, but the same pattern should work.

Step by Step

  1. Get user input
read -p "Enter command: " input

2. Constructing a prompt for Claude

prompt="\n\nHuman: Claude, please analyze the input text: '$input', \
and provide only the text that would fit directly into an osascript -e command to perform this action. \
DO NOT INCLUDE osascript -e itself, DO NOT return whitespace in front of the answer, and DO NOT RETURN a new line. \
DO NOT add string escapes for slashes, double quotes, or single quotes at all. \
\n\nAssistant:"

3. Call LLM

response=$(aws bedrock-runtime invoke-model \
--model-id anthropic.claude-v2 \
--region us-east-1 \
--body "{\"prompt\": \"$prompt\", \"max_tokens_to_sample\" : 300, \"temperature\": 0.1}" \
--cli-binary-format raw-in-base64-out \
invoke-model-output.txt)

In this case, you can reduce the creativity of the model and nudge it toward more analytical behavior. In Claude's case, I chose to reduce the temperature to almost zero (0.1).

Also, I am using the AWS CLI with the Bedrock Runtime.
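One caveat with building the --body JSON by hand: a double quote in $input breaks the payload. A safer sketch lets jq do the escaping (note the prompt then needs real newlines via $'\n', because jq would turn literal \n sequences into escaped backslash-n text):

```shell
# Build the request body with jq so user input is JSON-escaped safely.
input='Open "Notes" now'                      # example input containing quotes
prompt=$'\n\nHuman: analyze the input text: '"$input"$'\n\nAssistant:'

body=$(jq -n --arg p "$prompt" \
  '{prompt: $p, max_tokens_to_sample: 300, temperature: 0.1}')

# $body can now be passed as: --body "$body"
echo "$body"
```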

4. Extract the response

extractCMD=$(jq -r .completion invoke-model-output.txt)
echo "${extractCMD}" > extractCMD.applescript

Be careful to include the -r flag for jq: without it, the result is a quoted JSON string rather than the raw text you want to execute.
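To see why -r matters, compare the two on a mock response file (the file name here is illustrative; the real output file has the same shape):

```shell
# Mock Bedrock output with the same shape as invoke-model-output.txt:
echo '{"completion": "tell application \"Notes\" to activate"}' > mock-output.txt

jq .completion mock-output.txt
# without -r: "tell application \"Notes\" to activate"   (quoted, escaped JSON string)

jq -r .completion mock-output.txt
# with -r:    tell application "Notes" to activate       (raw text osascript can run)
```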

5. Run it with the osascript command

osascript -e "${extractCMD}"
# osascript extractCMD.applescript  (without -e, osascript reads from the file instead)

I leave two options open here, just in case I want to explore more.
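The file-based option pays off when the model returns a multi-line script, since stuffing embedded newlines into a single -e string is easy to get wrong. A minimal sketch (the AppleScript body is illustrative):

```shell
# Write a multi-line AppleScript to a file and run it without -e:
cat > extractCMD.applescript <<'EOF'
tell application "Notes"
  activate
end tell
EOF

osascript extractCMD.applescript
```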

Let’s check out the whole script

#!/bin/bash

read -p "Enter command: " input
prompt="\n\nHuman: Claude, please analyze the input text: '$input', \
and provide only the text that would fit directly into an osascript -e command to perform this action. \
DO NOT INCLUDE osascript -e itself, DO NOT return whitespace in front of the answer, and DO NOT RETURN a new line. \
DO NOT add string escapes for slashes, double quotes, or single quotes at all. \
\n\nAssistant:"
response=$(aws bedrock-runtime invoke-model \
--model-id anthropic.claude-v2 \
--region us-east-1 \
--body "{\"prompt\": \"$prompt\", \"max_tokens_to_sample\" : 300, \"temperature\": 0.1}" \
--cli-binary-format raw-in-base64-out \
invoke-model-output.txt)

extractCMD=$(jq -r .completion invoke-model-output.txt)
echo "${extractCMD}" > extractCMD.applescript

osascript -e "${extractCMD}"
# osascript extractCMD.applescript  (without -e, osascript reads from the file instead)

Run It

If you name the script chatbotautomate.sh, don't forget to chmod +x chatbotautomate.sh, then run it with ./chatbotautomate.sh. You will be prompted for input; try "Open Notes" or "Open google.com".

Next step

If you start experimenting with it, you will see that the script still has limitations: if we give it a series of commands in one go, it might not execute what we want.

We will cover more advanced thinking and coding (more iterations, breaking complexity down, bash running bash) in the next article. Stay tuned by following me.
