Google has reduced the daily request limits on its free Gemini API tier, including for Gemini 2.5 Flash. If you rely on those free requests for AI-generated video doorbell descriptions in Home Assistant, your automations can now hit the cap and fail.
To avoid automation failures, try these alternatives:
1. **Switch to a different model**: Some models still have reasonable daily limits, such as Gemini Robotics-ER 1.5 Preview (250 requests per day). While it's intended for robotics, it handles general vision tasks like snarky doorbell descriptions just fine; a minimal call is sketched after this list.
2. **Use GroqCloud**: Models like meta-llama/llama-4-maverick-17b-128e-instruct have higher limits (1,000 requests or 500,000 tokens per day). These limits may be cut in the future, but they're plenty for most users; see the second sketch after this list.
3. **Host a local LLM**: Running a local LLM keeps everything on your own hardware and removes rate limits entirely. It needs more capable hardware, but consumer-grade GPUs like the RTX 3060 can handle medium-sized vision-language models, and a local OpenAI-compatible server plugs into the same call pattern shown below.
4. **Pay for what you use**: Paid API access is relatively cheap ($0.30–$2.50 per million tokens), making it a viable option. Aggregators like OpenRouter also let you reach models from multiple providers through a single API key.
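
If you take the model-swap route, a minimal sketch of the request with the google-genai Python SDK might look like the following. The snapshot path, the prompt, and the exact preview model identifier are assumptions; check Google's current model list before relying on the ID.

```python
from google import genai
from google.genai import types

client = genai.Client(api_key="YOUR_GEMINI_API_KEY")

# Read the latest doorbell snapshot (placeholder path for whatever your
# Home Assistant camera snapshot automation writes out).
with open("/config/www/doorbell_snapshot.jpg", "rb") as f:
    image_bytes = f.read()

response = client.models.generate_content(
    # Assumed preview model ID; verify the exact string in Google's docs.
    model="gemini-robotics-er-1.5-preview",
    contents=[
        types.Part.from_bytes(data=image_bytes, mime_type="image/jpeg"),
        "Describe who or what is at the front door in one snarky sentence.",
    ],
)
print(response.text)
```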
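
For GroqCloud, OpenRouter, or a local server such as Ollama, the call pattern is essentially the same because they all expose OpenAI-style endpoints; only the base_url, API key, and model name change. A rough sketch, again with the snapshot path and prompt as placeholders:

```python
import base64
from openai import OpenAI

client = OpenAI(
    # Groq's OpenAI-compatible endpoint; swap in
    # "https://openrouter.ai/api/v1" for OpenRouter or
    # "http://localhost:11434/v1" for a local Ollama server.
    base_url="https://api.groq.com/openai/v1",
    api_key="YOUR_GROQ_API_KEY",
)

# Encode the doorbell snapshot as a data URL for the vision model.
with open("/config/www/doorbell_snapshot.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode()

response = client.chat.completions.create(
    model="meta-llama/llama-4-maverick-17b-128e-instruct",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text",
             "text": "Describe who is at the front door in one snarky sentence."},
            {"type": "image_url",
             "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
        ],
    }],
)
print(response.choices[0].message.content)
```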
Any of these routes will keep AI-generated video doorbell descriptions working in Home Assistant despite the tighter free-tier limits.
Source: https://www.howtogeek.com/gemini-slashed-free-api-limits-what-to-use-instead