# v1.2.0

## What's changed

### Added features

- New LLM wrapper: VLLM, for local batched inference

**Full Changelog**: here