Support AWS G7e (RTXPRO6000) instances #3752

Open
jvstme wants to merge 1 commit into master from aws_g7e

Conversation

@jvstme (Collaborator) commented Apr 7, 2026

```
$ dstack offer -b aws --gpu RTXPRO6000 --group-by gpu,backend,region

 #  GPU                   SPOT             $/GPU          BACKEND  REGION
 1  RTXPRO6000:96GB:1..8  spot, on-demand  0.5245..4.143  aws      us-east-2
 2  RTXPRO6000:96GB:1..8  spot, on-demand  0.6513..4.143  aws      us-west-2
 3  RTXPRO6000:96GB:1..8  spot, on-demand  1.0129..4.143  aws      us-east-1
```

Tested:

  • Spot and on-demand
  • Single-GPU and multi-GPU
  • nvidia-smi
  • PyTorch (a sketch of such a check is below)
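
For reference, a minimal sketch of what the single-GPU/multi-GPU PyTorch check could look like (illustrative only; the actual commands used for testing aren't shown in this PR):

```python
# Illustrative sketch of a GPU smoke test; assumes torch is installed
# on the provisioned instance and the CUDA drivers are working.
import torch

assert torch.cuda.is_available(), "CUDA is not available"

for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    # Per the offer listing, expect an RTX PRO 6000 with ~96 GB per GPU
    print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.0f} GiB")

    # Small matmul on each device to confirm compute actually runs
    x = torch.randn(1024, 1024, device=f"cuda:{i}")
    torch.cuda.synchronize(i)
    print(f"GPU {i}: matmul norm {(x @ x).norm().item():.2f}")
```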

@jvstme jvstme requested a review from peterschmidt85 April 7, 2026 18:25
@peterschmidt85 (Contributor) commented

Let's double-check that EFA works (or at least that it should work, based on the code) for the following instance types; a rough AWS-side check is sketched after the list:

g7e.8xlarge
g7e.12xlarge
g7e.24xlarge
g7e.48xlarge
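
As a starting point, here is a hedged sketch that queries the EC2 API for AWS-side EFA support on these sizes via boto3's `describe_instance_types`. It only confirms what AWS reports for the instance types; whether dstack's provisioning code actually attaches EFA interfaces would still need to be verified separately.

```python
# Sketch: check AWS-reported EFA support for the G7e sizes listed above.
# Assumes boto3 is installed and AWS credentials are configured; the
# region choice is arbitrary (instance-type metadata is per-region).
import boto3

INSTANCE_TYPES = [
    "g7e.8xlarge",
    "g7e.12xlarge",
    "g7e.24xlarge",
    "g7e.48xlarge",
]

ec2 = boto3.client("ec2", region_name="us-east-2")
resp = ec2.describe_instance_types(InstanceTypes=INSTANCE_TYPES)

for it in resp["InstanceTypes"]:
    net = it["NetworkInfo"]
    supported = net.get("EfaSupported", False)
    max_ifaces = net.get("EfaInfo", {}).get("MaximumEfaInterfaces", 0)
    print(f"{it['InstanceType']}: EfaSupported={supported}, "
          f"MaximumEfaInterfaces={max_ifaces}")
```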
