Cluster Super Metrics for Datastore-Specific Capacity
- Brock Peterson
- Sep 7
I had a Cluster with seven supporting Datastores, and I wanted Capacity metrics for just those with certain names. Here's how I did it.
I created three Super Metrics:
One for the total Datastore capacity, which looked like this. You can change the contains field to whatever you want; the Datastores I wanted had "vcf-ops" in their name.
sum({Datastore: Disk Space|Total Capacity, depth=3, where=({metric=: Configuration|Name} contains 'vcf-ops')})
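For example (a hypothetical name filter, just to illustrate where the change goes), if your Datastores had "prod" in their names instead, the same Super Metric would look like this.
sum({Datastore: Disk Space|Total Capacity, depth=3, where=({metric=: Configuration|Name} contains 'prod')})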
One for Datastore capacity used, which looked like this.
sum({Datastore: Disk Space|Utilization, depth=3, where=({metric=: Configuration|Name} contains 'vcf-ops')})
To get the % used, I took the ratio of the two Super Metrics above and multiplied by 100.
({This Resource: Super Metrics|Datastore Total Disk Space Used for Cluster}/{This Resource: Super Metrics|Datastore Total Disk Space for Cluster})*100
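The two references inside the braces match the names given to the first two Super Metrics above, so they need to line up with whatever you call yours. As a sketch with hypothetical names, if they were saved as "DS Total for Cluster" and "DS Used for Cluster", the expression would be:
({This Resource: Super Metrics|DS Used for Cluster}/{This Resource: Super Metrics|DS Total for Cluster})*100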
If you want these Super Metrics, they can be found here.
Looking at the Cluster in question, I have the following.

Total Datastore Disk Space of 55,267GB, Used Datastore Disk Space of 18,682GB, for a Percentage Used of 33%.
Looking at the Datastores supporting the Cluster in question, we have the following Total Capacity values, which total 55,267GB.

Checking the Datastores supporting the Cluster in question, we have the following Utilization values, which total 18,682GB.

Finally, (18,682 / 55,267) x 100 = 33%, which is what we expected. This is a good way to capture Datastore-specific capacity metrics from your Cluster; just change the contains value in the Super Metrics to meet your needs. Enjoy!