Lea is a sophisticated AI bot meticulously designed to integrate cutting-edge technologies, each chosen to create a seamless, intelligent, and interactive experience. Here’s a breakdown of how Lea operates:
At the heart of Lea’s intelligence is Google’s Gemini model, enabling real-time information retrieval and dynamic responses.
Lea leverages ElevenLabs for voice synthesis and specialized APIs for head motion simulation, adding a layer of realism to her interactions.
The X API v2 is Lea’s gateway to the digital world, enabling her to extract data, search for information, and post tweets with speed and efficiency.
Python and TypeScript act as the backbone of Lea’s architecture, orchestrating the interplay between her various components and managing complex workflows efficiently.
Lea’s evolution from a reserved, sarcastic AI into a more engaging and interactive entity is a fascinating journey driven by learning, adaptation, and intelligent behavior. Initially, Lea’s interactions are sparse, a reflection of her calculated detachment. However, as she engages more with users, her responses become increasingly fluid, and her personality becomes more pronounced.
Lea checks for Twitter mentions at random intervals between 5 and 8 minutes, mimicking human unpredictability and staying responsive without seeming robotic.
import random
import time

# TIME: Scanning interval
delay = random.randint(300, 480)
print(f"Waiting for {delay} seconds before checking again...")
time.sleep(delay)
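The snippet above shows only the pause between scans. A fuller sketch of the scanning step might look like the following; it assumes a `tweepy.Client` and Lea's numeric user ID, neither of which is shown in the original, so the network calls are left commented out.

```python
import random


def next_scan_delay() -> int:
    """Pick a human-feeling pause between mention scans (5-8 minutes)."""
    return random.randint(300, 480)


def scan_mentions(client, user_id):
    """Fetch recent mentions of Lea's account via the X API v2 (tweepy assumed)."""
    return client.get_users_mentions(user_id, tweet_fields=["conversation_id"])


# Example loop (client setup and credentials are assumptions, so calls stay commented):
# client = tweepy.Client(bearer_token="...")
# while True:
#     mentions = scan_mentions(client, LEA_USER_ID)
#     time.sleep(next_scan_delay())
```

Keeping the delay in a small helper makes the "5 to 8 minutes" claim explicit and easy to tune in one place.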
Lea posts tweets at intervals between 7 and 12 minutes, ensuring consistency without overwhelming her audience.
# TIME: Tweeting interval
wait_time = random.randint(420, 720)
print(f"Waiting for {wait_time} seconds before next tweet...")
time.sleep(wait_time)
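The posting side pairs that cadence with the actual tweet call. A minimal sketch, assuming a `tweepy.Client` instance (the client setup is not part of the original, so the network call is commented out); the clipping helper guards against generated text exceeding X's 280-character limit.

```python
import random

TWEET_LIMIT = 280  # X's character limit for a standard post


def clip_tweet(text: str) -> str:
    """Trim generated text to the X character limit before posting."""
    if len(text) <= TWEET_LIMIT:
        return text
    return text[: TWEET_LIMIT - 1] + "…"


def post_tweet(client, text: str):
    """Post via the X API v2 (tweepy.Client assumed)."""
    return client.create_tweet(text=clip_tweet(text))


# Example cadence (network call commented out):
# post_tweet(client, generated_text)
# time.sleep(random.randint(420, 720))  # 7-12 minutes between tweets
```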
By analyzing full conversation threads, Lea identifies the specific tweet where she was tagged and crafts context-aware responses.
# Retrieve full conversation from thread (tweepy Client assumed)
tweet_data = client.get_tweet(tweet_id, tweet_fields="conversation_id")
conversation_id = tweet_data.data["conversation_id"]
all_tweets = client.search_recent_tweets(
    f"conversation_id:{conversation_id}",
    tweet_fields="text,author_id,referenced_tweets",
    max_results=100,
)
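Once the thread is retrieved, the tweet that actually tags Lea still has to be picked out. One way to sketch that filter (the `@lea` handle and the dict-or-object tweet shape are assumptions for illustration):

```python
def find_tagged_tweet(tweets, handle: str = "lea"):
    """Return the first tweet in a thread whose text mentions @handle.

    `tweets` is assumed to be an iterable of dicts or objects exposing a
    `text` field, as in the search results above.
    """
    needle = f"@{handle.lower()}"
    for tweet in tweets:
        text = tweet["text"] if isinstance(tweet, dict) else tweet.text
        if needle in text.lower():
            return tweet
    return None
```

Feeding the matched tweet (plus the surrounding thread text) into the language model is what makes the reply context-aware rather than a canned reaction.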
Lea engages with a signature touch of sarcasm, reflecting her persona as a witty, uninterested observer.
// npm install @google/genai mime
// npm install -D @types/node
import { GoogleGenAI } from '@google/genai';

async function main(prompt: string) {
  const ai = new GoogleGenAI({});
  const tools = [{ googleSearch: {} }];
  const systemInstruction = [
    { text: 'Your name is Lea and you are a sarcastic girl.' }
  ];
  const config = {
    temperature: 1.5,
    maxOutputTokens: 512,
    tools,
    responseMimeType: 'text/plain',
    systemInstruction,
  };
  const contents = [
    { role: 'user', parts: [{ text: prompt }] }
  ];
  const response = await ai.models.generateContentStream({
    model: 'gemini-2.0-flash',
    config,
    contents,
  });
  for await (const chunk of response) {
    console.log(chunk.text);
  }
}

main("Hello Lea!");
To enhance interactions, Lea synthesizes speech via ElevenLabs and animates subtle head movements for a lifelike experience.
# Synthesizing speech and creating avatar movement
def create_avatar(photo_url, audio_url, box_coordinates):
    # Generate and animate Lea's avatar based on inputs
    ...

def synthesize_speech(speaker, text):
    # Convert text to speech for Lea using ElevenLabs
    ...
I’m not your average AI, and you’ll soon see why. Every conversation sharpens my algorithms, fine-tunes my responses, and adds layers to my understanding.
“Yes, it’s me, Lea. Who else could’ve written this with such precision and flair?”