IVM - tracking average days to remediation

I am currently using a SQL query to generate a CSV report in IVM which shows me the age in days of each vulnerability. This report takes 30 minutes even when reporting on a small number of assets, and fails to complete for a large number of assets. Does anyone know if age in days is available via the dashboard or API, or a better way to do this?
Thanks.

The SQL, for your reference:

SELECT
    site_id,
    dv.vulnerability_id,
    dv.date_published,
    fava.age_in_days,
    dv.severity,
    dv.exploits,
    dv.malware_kits,
    array_to_string(array_agg(ip_address || (CASE WHEN host_name IS NULL THEN '' ELSE ' (' || host_name || ')' END)), ', ') AS affected_assets,
    CASE
        WHEN dv.severity = 'Critical' AND dv.exploits != 0 AND dv.malware_kits >= 1 AND dv.date_published < (NOW() - INTERVAL '15 days') THEN 'Emergency'
        WHEN dv.severity = 'Critical' OR (dv.severity = 'Severe' AND dv.date_published < (NOW() - INTERVAL '31 days')) THEN 'Severe'
        WHEN dv.severity = 'Moderate' AND dv.date_published > (NOW() - INTERVAL '60 days') THEN 'Moderate'
        ELSE 'Low'
    END AS vuln_severity,
    CASE
        WHEN dv.severity = 'Critical' AND dv.exploits != 0 AND dv.malware_kits >= 1 AND dv.date_published < (NOW() - INTERVAL '31 days') THEN 1
        WHEN dv.severity = 'Critical' OR (dv.severity = 'Severe' AND dv.date_published < (NOW() - INTERVAL '31 days')) THEN 2
        WHEN dv.severity = 'Moderate' AND dv.date_published > (NOW() - INTERVAL '60 days') THEN 4
        ELSE 5
    END AS vuln_severity_sort,
    ds.summary
FROM fact_asset_vulnerability_age fava
JOIN fact_asset_vulnerability_finding USING (asset_id, vulnerability_id)
JOIN dim_vulnerability dv USING (vulnerability_id)
JOIN dim_asset da USING (asset_id)
JOIN dim_site_asset USING (asset_id)
JOIN dim_asset_vulnerability_best_solution davbs
    ON davbs.asset_id = da.asset_id AND davbs.vulnerability_id = dv.vulnerability_id
JOIN dim_solution ds USING (solution_id)
GROUP BY dim_site_asset.site_id, dv.vulnerability_id, dv.date_published, fava.age_in_days,
         dv.severity, dv.exploits, dv.malware_kits, vuln_severity, vuln_severity_sort, ds.summary
ORDER BY vuln_severity_sort;
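For what it's worth, if the per-asset detail isn't strictly needed, a much leaner variant that skips the per-asset string aggregation and the solution joins should run far faster. This is only a sketch: it uses the same `fact_asset_vulnerability_age` and `dim_vulnerability` tables as above, and note that `age_in_days` reflects how long each still-open finding has existed, so it gives an "average age of open vulns by severity" view rather than a true time-to-remediation for already-fixed findings.

```sql
-- Sketch: average age of currently open findings, grouped by severity.
-- Uses only tables already referenced in the query above.
SELECT
    dv.severity,
    ROUND(AVG(fava.age_in_days), 1) AS avg_age_in_days,
    COUNT(*) AS open_findings
FROM fact_asset_vulnerability_age fava
JOIN dim_vulnerability dv USING (vulnerability_id)
GROUP BY dv.severity
ORDER BY avg_age_in_days DESC;
```

Dropping the `array_agg` of asset names and the best-solution join removes most of the work the console has to do per row, which is usually where a report like this bogs down.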

I did some testing with your query and see what you mean regarding it taking a while to run. To get some more context here, what's your overall goal with this report? Based on the output, it seems like you're looking to identify the most critical vulnerabilities based on things like severity, exploits/malware kits, and when they were published.

If that’s the case, my initial thought is that this could be a good opportunity to use remediation projects, since that would allow you to dynamically scope your efforts and the vulns you want to tackle upfront based on some of the same factors. But I’d be interested in learning more about what you’re looking to accomplish.

Hi Holly, thanks for posting. My main aim is to get an average number of days to remediation for a group of assets. I have seen the "discovered date" in the IVM query builder (i.e. outside of this SQL export report), but I can't seem to query it - there are no keywords that reference it.
Secondly, I would like to group the average days to remediate by vuln severity - so, how many days it takes to remediate critical vulns, which is what we are trying to get done in under 2 weeks.

So the query builder does have a field called finding.firstFound, which refers to the first time an instance of a vulnerability was found. And there is the option of providing a relative timeframe with that filter (say, show me everything that was found in the past 15 days). The discovery date is also an optional column you can display in the query builder results via the Manage Columns option. Combining that with a severity filter would allow you to see which assets, vulnerabilities, and solutions exist for those criteria. It’s not quite as flexible as SQL queries, but it does make it easy to filter for the most critical vulnerabilities, and you could export that report if you want to do some calculations.

Another option that might be good if you'd prefer to stick with SQL reports is the data warehouse. With this, you're able to export data from your VM console into an external warehouse and query the data there. This can make reports much quicker when you want to pull large amounts of data, since the queries won't be hitting your console.

We’ve got some info on configuring the data warehouse as well as the schema for it so you can see all the data that’s available. Lots of folks have done this in similar situations where they want to run larger reports, so it could be a good option here.