
feat(vllm): add vLLM module#886

Open
Josephrp wants to merge 3 commits into testcontainers:main from Josephrp:vllm

Conversation

@Josephrp

solves #885

Includes the complete vLLM API; it can be used for generation, embeddings, fine-tuning, dataset creation, testing experiment configurations, RL, performance testing, etc.

@alexanderankin
Member

Take a look at the example of a database container or the Ollama container. They are intentionally minimal, giving you only what you need to get started. The purpose of this library is to provide temporary test infrastructure, not to implement every feature. If you can get this down to the same number of lines as Ollama or fewer, I may consider merging. It looks like a promising project. If it implements the same API as Ollama, let's maybe even consider making one container work for both images; I am not sure if this is possible/feasible.

@Josephrp
Author

Josephrp commented Oct 3, 2025

I see why you're saying this, and I understand where you're coming from. However, it will be easier for me to show you the necessity and use of a fully featured object with a demo, so I'll do that in the next two weeks as I test, use, and build with testcontainers. Meanwhile, I'll dig into your comments and see if I can get it to where you're saying it should be. :-)

@kiview
Member

kiview commented Oct 6, 2025

@Josephrp We are currently working on adding vLLM support into Docker Model Runner. So I think a much better approach for the project is adding integration with Docker Model Runner (similar to what Java and Go already did), which should also help with giving GPU access on e.g. macOS hosts (since we run inference natively).

https://java.testcontainers.org/modules/docker_model_runner/

@ericcurtin

ericcurtin commented Oct 6, 2025

Thanks for highlighting this, @kiview. Yes, we are actively working on this; it is not ready yet, but please feel free to collaborate with us on it:

https://github.com/docker/inference-engine-vllm

vllm-project/vllm#26160

@Tranquility2 Tranquility2 changed the title Feat : vLLM - adds object , adds tests , container , init , dataclasses feat(vllm): add vLLM module Apr 3, 2026


5 participants