# LLM Calling via AO Process
An AO process handler for calling LLMs via HTTP API. This example demonstrates integration with Google's Gemini API, but the same pattern can be adapted for other LLM providers such as OpenAI or Anthropic.
## Configuration
```lua
GEMINI_API_URL =
"https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash:generateContent?key=GEMINI_API_KEY"
```
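A minimal sketch of one way to keep the key separate from the endpoint string. `GEMINI_API_KEY` here is a placeholder you replace with your own key before loading the file; this is an illustrative variation, not part of the original handler code:

```lua
-- Hypothetical: keep the key in its own variable and build the URL at load time.
GEMINI_API_KEY = 'YOUR_GEMINI_API_KEY' -- placeholder; replace with your real key
GEMINI_API_URL = 'https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash:generateContent?key='
  .. GEMINI_API_KEY
```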
> ⚠️ **WARNING**: The API key will be **PUBLIC** when deployed on the AO network, meaning anyone can see and potentially use it. Since Gemini provides a **free tier**, this is less concerning than with paid services.
> 💡 **Note**: A proper solution to the API key exposure issue is still being explored.
## Handler: Send-Prompt
Main handler that processes user prompts and sends them to the Gemini API.
```lua
Handlers.add('Send-Prompt', 'Send-Prompt', function(msg)
  assert(msg['user-prompt'], 'No user prompt provided')
  print('\nProcessing user prompt...')
  print('\nUser prompt:\n', msg['user-prompt'])

  -- Build the request body in the shape Gemini's generateContent endpoint expects
  local promptBody = {
    contents = {{
      parts = {{
        text = msg['user-prompt']
      }}
    }}
  }

  -- Relay the HTTP call; the reply is delivered to the Receive-Response handler
  send({
    target = id,
    ['relay-path'] = GEMINI_API_URL, -- or msg['AI-Endpoint'] if you want to use a different endpoint
    ['relay-method'] = 'POST',
    ['relay-body'] = require('json').encode(promptBody),
    ['Content-Type'] = 'application/json',
    resolve = '~relay@1.0/call/~patch@1.0',
    action = 'Receive-Response'
  })
end)
```
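For reference, `promptBody` serializes to the minimal request body that Gemini's `generateContent` endpoint expects. A quick check, assuming the same `json` module used by the handlers (the prompt text is illustrative):

```lua
-- Inspect the payload shape before relaying it.
local json = require('json')
local promptBody = {
  contents = {{
    parts = {{ text = 'gm gemini from permaweb' }}
  }}
}
print(json.encode(promptBody))
-- prints something equivalent to: {"contents":[{"parts":[{"text":"gm gemini from permaweb"}]}]}
```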
## Handler: Receive-Response
Helper handler that processes responses from the Gemini API.
```lua
Handlers.add('Receive-Response', 'Receive-Response', function(msg)
  if msg.body then
    -- Gemini returns a list of candidates; print the text of the first one
    local response = require('json').decode(msg.body)
    print('\nResponse:\n', response.candidates[1].content.parts[1].text)
    print('\nResponse source:', msg['relay-path'])
  else
    print('Error or empty response:', msg)
  end
end)
```
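The handler above assumes a well-formed Gemini response. A slightly more defensive variant is sketched below; it is a hypothetical alternative (note the different handler name), not part of the original file, and it guards both the JSON decode and the nested field access:

```lua
-- Hypothetical defensive variant; not part of the original handler.
Handlers.add('Receive-Response-Safe', 'Receive-Response-Safe', function(msg)
  if not msg.body then
    print('Error or empty response')
    return
  end
  local ok, response = pcall(require('json').decode, msg.body)
  if ok and type(response) == 'table' and response.candidates and response.candidates[1] then
    print('\nResponse:\n', response.candidates[1].content.parts[1].text)
  else
    -- on failure Gemini returns an error object instead of candidates
    print('API error or unexpected body:', msg.body)
  end
end)
```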
## Usage (AOS CLI)
1. Initialize Hyper-AOS
```bash
aos gemini_agent --url https://hb.arweave.asia
```
2. Load the Lua file into the process
```bash
.load gemini.lua
```
3. Send a prompt
```lua
send({target=id, action='Send-Prompt', ['user-prompt']='gm gemini from permaweb'})
```
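The inline comment in `Send-Prompt` hints at overriding the endpoint per message. A minimal sketch of that idea, assuming you change the `relay-path` line as shown in the comment below (the endpoint URL is a hypothetical example):

```lua
-- In Send-Prompt, the relay-path line could become (hypothetical change):
--   ['relay-path'] = msg['AI-Endpoint'] or GEMINI_API_URL,

-- A caller could then point a single prompt at a different endpoint:
send({
  target = id,
  action = 'Send-Prompt',
  ['user-prompt'] = 'gm gemini from permaweb',
  ['AI-Endpoint'] = 'https://example.com/llm-endpoint' -- hypothetical URL
})
```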
## Notes
Alternative node URLs:
- `https://hb.arweave.asia` (recommended)
- `https://hb.arnode.asia`
- `https://workshop.forward.computer` (temporary node – may go offline soon)
## Sample Response
```
Processing user prompt...
User prompt:
gm gemini from permaweb
Response:
Good morning! I'm ready to assist you with information and tasks related to the Permaweb, Arweave, or any other topics you'd like to discuss. How can I help you today?
Response source: https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash:generateContent?key=GEMINI_API_KEY
```
## Requirements
- `AI_Gateway_Endpoint`: the AI provider's API URL (`GEMINI_API_URL` in this example)
- `AI_API_KEY`: a valid API key for the provider (embedded directly in `GEMINI_API_URL` in this example)
- AO process with relay capabilities
- JSON library