WebSocket Streaming API
by Community · open-source · Last verified 2026-03-17
WebSocket server that proxies token-by-token LLM streaming to multiple simultaneous clients, with connection lifecycle management, heartbeat keep-alives, and per-session context persistence. Supports fan-out broadcasting for collaborative AI sessions and reconnection with message replay.
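The core pattern described here — proxying a token-by-token LLM stream to a client while interleaving heartbeat keep-alives — can be sketched with the standard library alone. The project presumably wires this into a FastAPI/Starlette WebSocket endpoint; the function names, frame shapes, and the fake token source below are illustrative assumptions, not the project's actual API:

```python
import asyncio

HEARTBEAT_INTERVAL = 0.05  # seconds; illustrative value, not from the project

async def fake_llm_tokens():
    # Stand-in for the streaming iterator (e.g. an OpenAI stream) being proxied.
    for tok in ["Hel", "lo", ",", " wor", "ld"]:
        await asyncio.sleep(0.02)
        yield tok

async def stream_with_heartbeat(send):
    """Interleave token frames with periodic ping frames on one connection."""
    async def heartbeat():
        while True:
            await asyncio.sleep(HEARTBEAT_INTERVAL)
            await send({"type": "ping"})

    hb = asyncio.create_task(heartbeat())
    try:
        async for tok in fake_llm_tokens():
            await send({"type": "token", "data": tok})
        await send({"type": "done"})
    finally:
        hb.cancel()  # stop pinging once the stream ends or the client drops

async def main():
    frames = []

    async def send(frame):
        # In the real server this would be websocket.send_json(frame).
        frames.append(frame)

    await stream_with_heartbeat(send)
    return frames

frames = asyncio.run(main())
tokens = [f["data"] for f in frames if f["type"] == "token"]
print("".join(tokens))  # → Hello, world
```

Running the heartbeat as a sibling task (rather than between tokens) keeps pings flowing even when the upstream model stalls mid-generation, which is what lets proxies of this kind survive idle-connection timeouts.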
https://github.com/encode/starlette
Overall grade: C+ (Average)
Adoption: B+ · Quality: A · Freshness: A · Citations: C+ · Engagement: F
Specifications
- License: MIT
- Pricing: open-source
- Capabilities: token-streaming, fan-out-broadcast, session-persistence, auto-reconnect
- Integrations: fastapi, starlette, openai, redis, asyncio
- Use Cases: real-time-chat, collaborative-writing, live-analytics-commentary
- API Available: Yes
- Language: python
- Dependencies: fastapi, starlette, uvicorn, openai, redis, websockets
- Environment: Python 3.11+
- Est. Runtime: <50ms latency to first token
- Tags: websocket, streaming, real-time, llm-streaming, fastapi
- Added: 2026-03-17
- Completeness: 100%
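The fan-out-broadcast capability listed above typically centers on a connection manager that relays each upstream frame to every subscribed client. A minimal stdlib sketch of that pattern, using one queue per client (class and method names are assumptions, not the project's API):

```python
import asyncio

class Broadcaster:
    """Fan one token stream out to many subscribers, each with its own queue."""

    def __init__(self):
        self.subscribers: set[asyncio.Queue] = set()

    def subscribe(self) -> asyncio.Queue:
        q: asyncio.Queue = asyncio.Queue()
        self.subscribers.add(q)
        return q

    def unsubscribe(self, q: asyncio.Queue) -> None:
        self.subscribers.discard(q)

    async def publish(self, frame) -> None:
        # Each client drains its own queue, so one slow reader
        # does not stall delivery to the others.
        for q in self.subscribers:
            await q.put(frame)

async def main():
    b = Broadcaster()
    q1, q2 = b.subscribe(), b.subscribe()

    # Simulate the upstream LLM stream being published once...
    for tok in ["a", "b", "c"]:
        await b.publish({"type": "token", "data": tok})
    await b.publish({"type": "done"})

    # ...and two clients independently consuming it.
    async def drain(q):
        out = []
        while True:
            frame = await q.get()
            if frame["type"] == "done":
                return out
            out.append(frame["data"])

    return await drain(q1), await drain(q2)

r1, r2 = asyncio.run(main())
print(r1, r2)  # both clients see the full token sequence
```

In a real deployment each queue would be drained by that client's WebSocket send loop; the per-client queue is what makes collaborative sessions tolerant of uneven consumer speeds.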
Index Score: 58.4
- Adoption: 70
- Quality: 83
- Freshness: 88
- Citations: 55
- Engagement: 0
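Reconnection with message replay, listed under Capabilities, is usually implemented by numbering frames per session and buffering them, so a returning client can request everything after the last sequence number it saw. The project lists Redis among its dependencies, presumably for persisting this per-session context; an in-memory sketch conveys the idea (all names here are illustrative):

```python
from collections import deque

class SessionBuffer:
    """Per-session replay buffer; frames carry monotonically increasing seq ids."""

    def __init__(self, max_frames: int = 1000):
        # Bounded buffer: very old frames age out rather than growing forever.
        self.frames: deque = deque(maxlen=max_frames)
        self.next_seq = 0

    def append(self, data: str) -> dict:
        frame = {"seq": self.next_seq, "data": data}
        self.next_seq += 1
        self.frames.append(frame)
        return frame

    def replay_after(self, last_seen: int) -> list:
        # On reconnect the client sends the last seq it received;
        # everything newer is replayed before live streaming resumes.
        return [f for f in self.frames if f["seq"] > last_seen]

buf = SessionBuffer()
for tok in ["alpha", "beta", "gamma", "delta"]:
    buf.append(tok)

# Client dropped after seeing seq 1 ("beta"); replay the gap on reconnect.
missed = buf.replay_after(1)
print([f["data"] for f in missed])  # → ['gamma', 'delta']
```

Keying such buffers by session id in Redis (rather than process memory) is what lets replay survive server restarts and multi-worker deployments.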