ScriptAI Infrastructure v1.1

Serverless Model Deploy

by Community · open-source · Last verified 2026-03-17

Packages a trained ML model into a serverless function on AWS Lambda, Modal, or Google Cloud Run, handling cold-start optimization, dependency layering, and auto-scaling configuration. Includes health-check endpoints, structured logging, and a GitHub Actions workflow for automated rollout.

https://github.com/modal-labs/modal-examples
Overall grade: C+ (Average)
Adoption: B+ · Quality: A · Freshness: A · Citations: C+ · Engagement: F

Specifications

License: MIT
Pricing: open-source
Capabilities: cold-start-optimization, auto-scaling, health-checks, ci-cd-integration
Integrations: modal, aws-lambda, cloud-run, github-actions, docker
Use Cases: low-traffic-inference, event-driven-ml, cost-optimized-serving
API Available: No
Language: python
Dependencies: modal, boto3, fastapi, uvicorn, docker
Environment: Python 3.10+, Docker
Est. Runtime: Deploy: 3-10 minutes
Tags: serverless, lambda, modal, deployment, mlops
Added: 2026-03-17
Completeness: 100%
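The cold-start-optimization capability listed above usually comes down to caching the model at module scope, so only the first invocation in a fresh container pays the load cost and warm invocations reuse it. A minimal sketch of that pattern, with a hypothetical Lambda-style `handler` and a stubbed model load:

```python
import time

# Cached at module scope: survives across warm invocations of the
# same container, so the load runs once per cold start.
_MODEL = None

def get_model():
    global _MODEL
    if _MODEL is None:
        # Stand-in for an expensive load (e.g. fetching weights from S3
        # with boto3 and deserializing a trained artifact).
        _MODEL = {"loaded_at": time.time()}
    return _MODEL

def handler(event, context=None):
    # Lambda-style entry point: only a cold start triggers the load.
    model = get_model()
    return {"statusCode": 200, "loaded_at": model["loaded_at"]}
```

The same pattern applies on Modal and Cloud Run, where a container instance likewise serves many requests between cold starts.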

Index Score: 59

Adoption: 72
Quality: 82
Freshness: 88
Citations: 55
Engagement: 0
