Hi all,
I think this is really a Power Query question, but ultimately I'll be using this in Power BI. I'm scratching my head with this one and would appreciate some advice. I have a dataset of up/down time periods for approximately 15 websites; see the illustrative example at the bottom of this post. I need to report the % of uptime per month for each website from this dataset (see 'Desired outcome' below), but I'm struggling to get what I want. I feel I'm close with a combination of filtering to only 'down' periods, adding an 'End of month' column (named 'created_at_EOM' in the example below), then grouping the data by website_name, date_created and duration (aggregated as a sum), but I'm running into some problems.
Thanks in advance for any input!
(The blanks are because of the most recent period, but I can easily fix those by inputting the current time with DateTime.LocalNow.)
Illustrative example:
TestID | WebsiteName | status | created_at | ended_at | duration |
1 | Website 1 | up | 2022-03-24T09:38:17+00:00 | 2022-04-03T09:28:43+00:00 | 863426000 |
1 | Website 1 | down | 2022-04-03T09:28:43+00:00 | 2022-04-03T09:30:10+00:00 | 87000 |
1 | Website 1 | up | 2022-04-03T09:30:10+00:00 | 2022-05-02T08:06:09+00:00 | 2500559000 |
1 | Website 1 | down | 2022-05-02T08:06:09+00:00 | 2022-05-02T08:07:29+00:00 | 80000 |
1 | Website 1 | up | 2022-05-02T08:07:29+00:00 | 2022-05-29T10:29:54+00:00 | 2341345000 |
1 | Website 1 | down | 2022-05-29T10:29:54+00:00 | 2022-05-29T10:31:07+00:00 | 73000 |
1 | Website 1 | up | 2022-05-29T10:31:07+00:00 | 2023-01-15T13:40:08+00:00 | 19969741000 |
1 | Website 1 | down | 2023-01-15T13:40:08+00:00 | 2023-01-15T13:41:21+00:00 | 73000 |
1 | Website 1 | up | 2023-01-15T13:41:21+00:00 | 2023-01-15T16:13:06+00:00 | 9105000 |
1 | Website 1 | down | 2023-01-15T16:13:06+00:00 | 2023-01-15T16:14:17+00:00 | 71000 |
1 | Website 1 | up | 2023-01-15T16:14:17+00:00 | ||
2 | Website 2 | up | 2017-02-08T11:15:45+00:00 | 2023-03-19T14:30:30+00:00 | 1.92684E+11 |
2 | Website 2 | down | 2023-03-19T14:30:30+00:00 | 2023-03-19T14:40:19+00:00 | 589000 |
2 | Website 2 | up | 2023-03-19T14:40:19+00:00 | 2023-03-19T15:21:47+00:00 | 2488000 |
2 | Website 2 | down | 2023-03-19T15:21:47+00:00 | 2023-03-19T15:32:31+00:00 | 644000 |
2 | Website 2 | up | 2023-03-19T15:32:31+00:00 | ||
3 | Website 3 | up | 2023-08-22T05:04:23+00:00 | 2023-09-05T00:59:57+00:00 | 1194934000 |
3 | Website 3 | down | 2023-09-05T00:59:57+00:00 | 2023-09-05T01:00:03+00:00 | 6000 |
3 | Website 3 | up | 2023-09-05T01:00:03+00:00 | 2023-09-14T16:31:51+00:00 | 833508000 |
3 | Website 3 | down | 2023-09-14T16:31:51+00:00 | 2023-09-14T16:33:02+00:00 | 71000 |
3 | Website 3 | up | 2023-09-14T16:33:02+00:00 | 2023-12-01T03:44:51+00:00 | 6693109000 |
3 | Website 3 | down | 2023-12-01T03:44:51+00:00 | 2023-12-01T03:47:22+00:00 | 151000 |
3 | Website 3 | up | 2023-12-01T03:47:22+00:00 | 2023-12-01T05:51:31+00:00 | 7449000 |
3 | Website 3 | down | 2023-12-01T05:51:31+00:00 | 2023-12-01T05:54:06+00:00 | 155000 |
3 | Website 3 | up | 2023-12-01T05:54:06+00:00 | 2024-04-21T01:16:35+00:00 | 12252149000 |
3 | Website 3 | down | 2024-04-21T01:16:35+00:00 | 2024-04-21T01:18:59+00:00 | 144000 |
3 | Website 3 | up | 2024-04-21T01:18:59+00:00 | 2024-04-29T08:46:35+00:00 | 718056000 |
3 | Website 3 | down | 2024-04-29T08:46:35+00:00 | 2024-04-29T08:47:46+00:00 | 71000 |
3 | Website 3 | up | 2024-04-29T08:47:46+00:00 | 2024-08-28T07:57:26+00:00 | 10451380000 |
3 | Website 3 | down | 2024-08-28T07:57:26+00:00 | 2024-08-28T07:59:44+00:00 | 138000 |
3 | Website 3 | up | 2024-08-28T07:59:44+00:00 | ||
Desired outcome (the percentages aren't based on the data above; I've just put in random figures to demonstrate):
TestID | WebsiteName | period | uptime |
1 | Website 1 | Dec-24 | 100.00% |
1 | Website 1 | Jan-25 | 99.98% |
1 | Website 1 | Feb-25 | 99.96% |
1 | Website 1 | Mar-25 | etc |
2 | Website 2 | Dec-24 | etc |
2 | Website 2 | Jan-25 | etc |
2 | Website 2 | Feb-25 | etc |
2 | Website 2 | Mar-25 | etc |
3 | Website 3 | Dec-24 | etc |
3 | Website 3 | Jan-25 | etc |
3 | Website 3 | Feb-25 | etc |
3 | Website 3 | Mar-25 | etc |
You already have:
- WebsiteName
- created_at
- ended_at
- duration
- status (up/down)
The reason for the approach below: if a website has no downtime in a month, your data has no record for that month at all, but you still want to report 100% uptime for it.
Steps:
1. Create a Dates table with all months you care about.
2. Cross-join it with your distinct WebsiteName list. Result: even "no downtime" months will exist in the master list.
3. Filter your original table to only status = "down" periods.
4. For each Website + Month, sum the duration (downtime in milliseconds).
5. Merge (left join) your Website-Month master list with the downtime sums. If a month has no downtime, downtime = 0.
Simple formula:
Uptime % = (Milliseconds in month - Downtime) / Milliseconds in month
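For example, the 87,000 ms (87-second) outage on 2022-04-03 in the sample data falls in a 30-day month: 30 × 24 × 60 × 60 × 1000 = 2,592,000,000 ms, so Website 1's uptime for April 2022 would be (2,592,000,000 - 87,000) / 2,592,000,000 ≈ 99.997%.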
Dates Table
let
    StartDate = #date(2022, 1, 1),
    EndDate = Date.EndOfMonth(DateTime.Date(DateTime.LocalNow())),
    // one row per day from StartDate through the end of the current month
    Dates = List.Dates(StartDate, Duration.Days(EndDate - StartDate) + 1, #duration(1,0,0,0)),
    TableDates = Table.FromList(Dates, Splitter.SplitByNothing(), {"Date"}),
    // label each day with its month, e.g. "Dec-24"
    AddYearMonth = Table.AddColumn(TableDates, "YearMonth", each Date.ToText([Date], "MMM-yy"), type text),
    // keep one row per month
    RemoveDay = Table.Distinct(Table.SelectColumns(AddYearMonth, {"YearMonth"}))
in
    RemoveDay
Just pull distinct WebsiteName from your fact table.
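A minimal sketch of that step, assuming your source query is named UptimeData (a placeholder; use your actual query name):

    // one row per website
    WebsiteList = Table.Distinct(Table.SelectColumns(UptimeData, {"WebsiteName"}))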
Cross-join (all combinations):
CrossJoin = Table.AddColumn(WebsiteList, "Custom", each RemoveDay),
Expand = Table.ExpandTableColumn(CrossJoin, "Custom", {"YearMonth"})
- Filter status = "down".
- Create a YearMonth column based on created_at.
- Group by WebsiteName and YearMonth, summing duration.
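A minimal sketch of those three steps, assuming the source query is named UptimeData and created_at is typed as datetimezone (adjust names to your own query):

    // keep only downtime periods
    DownOnly = Table.SelectRows(UptimeData, each [status] = "down"),
    // label each downtime row with the month it started in, e.g. "Dec-24"
    AddMonth = Table.AddColumn(DownOnly, "YearMonth",
        each Date.ToText(DateTime.Date(DateTimeZone.RemoveZone([created_at])), "MMM-yy"), type text),
    // total downtime (ms) per website per month
    DownByMonth = Table.Group(AddMonth, {"WebsiteName", "YearMonth"},
        {{"Downtime", each List.Sum([duration]), type number}})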
Left-join the downtime sums onto your Website-Month table.
If downtime is null, replace it with 0.
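A sketch of that merge, assuming the expanded Website-Month table from the cross-join above is named Expand and the grouped downtime table is named DownByMonth:

    // left join: keep every Website-Month row, attach downtime where it exists
    Joined = Table.NestedJoin(Expand, {"WebsiteName", "YearMonth"}, DownByMonth, {"WebsiteName", "YearMonth"}, "Down", JoinKind.LeftOuter),
    Expanded = Table.ExpandTableColumn(Joined, "Down", {"Downtime"}),
    // months with no downtime get 0 instead of null
    NoNulls = Table.ReplaceValue(Expanded, null, 0, Replacer.ReplaceValue, {"Downtime"})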
Add a column:
DaysInMonth = Date.DaysInMonth(Date.FromText("01-" & [YearMonth]))
Milliseconds in a month:
MillisecondsInMonth = [DaysInMonth] * 24 * 60 * 60 * 1000
Uptime %:
UptimePercent = ([MillisecondsInMonth] - [Downtime]) / [MillisecondsInMonth]
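Written as full applied steps, those three columns might look like this (a sketch; NoNulls is the table after the null replacement above, and Date.FromText relies on an English locale to parse the "MMM" month name):

    AddDays = Table.AddColumn(NoNulls, "DaysInMonth", each Date.DaysInMonth(Date.FromText("01-" & [YearMonth])), Int64.Type),
    AddMs = Table.AddColumn(AddDays, "MillisecondsInMonth", each [DaysInMonth] * 24 * 60 * 60 * 1000, type number),
    AddUptime = Table.AddColumn(AddMs, "UptimePercent", each ([MillisecondsInMonth] - [Downtime]) / [MillisecondsInMonth], type number)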
Are you assuming that if it is not down then it is up? Are there no other states ("not live yet", "decommissioned")?
How granular do you need this to be? By the second*, minute, or hour? Keep in mind that uptime percentage per month becomes increasingly meaningless as months have different numbers of days.
* meaning roughly 31 million entries per year per website
Your sample data indicates near 100% uptime. Not sure what the business value is. You also have the same timestamp for the abutting events. That will further skew the result.
Hi @sgt_pepper The issue lies in calculating uptime from a dataset with start and end dates spanning multiple months. Begin by replacing null ended_at values with the current date to handle ongoing periods. Next, use a date table to align your data with monthly intervals and split durations accordingly. Finally, aggregate the data by month and website, calculating uptime as the proportion of up duration to the total time in the month.
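A minimal sketch of that approach, assuming the source query is named UptimeData and created_at / ended_at are typed as datetimezone so the blanks load as null (all step names here are placeholders):

    let
        Source = UptimeData,
        // open-ended periods: treat "now" as the end
        Filled = Table.ReplaceValue(Source, null, DateTimeZone.LocalNow(), Replacer.ReplaceValue, {"ended_at"}),
        // for each row, build one record per calendar month the period overlaps
        PerMonth = Table.AddColumn(Filled, "Months", each
            let
                s = DateTimeZone.RemoveZone([created_at]),
                e = DateTimeZone.RemoveZone([ended_at]),
                firstMonth = #datetime(Date.Year(s), Date.Month(s), 1, 0, 0, 0),
                monthStarts = List.Generate(
                    () => firstMonth,
                    (m) => m < e,
                    (m) => DateTime.From(Date.AddMonths(DateTime.Date(m), 1))
                )
            in
                List.Transform(monthStarts, (m) =>
                    let
                        mEnd = DateTime.From(Date.AddMonths(DateTime.Date(m), 1)),
                        // portion of this period that falls inside month m, in seconds
                        secs = Duration.TotalSeconds(List.Min({e, mEnd}) - List.Max({s, m}))
                    in
                        [YearMonth = Date.ToText(DateTime.Date(m), "MMM-yy"), Seconds = secs]
                )
        ),
        ExpandedList = Table.ExpandListColumn(PerMonth, "Months"),
        ExpandedRecs = Table.ExpandRecordColumn(ExpandedList, "Months", {"YearMonth", "Seconds"}),
        // total up/down seconds per website per month
        Grouped = Table.Group(ExpandedRecs, {"WebsiteName", "YearMonth", "status"},
            {{"Seconds", each List.Sum([Seconds]), type number}})
    in
        Grouped

From the grouped result, uptime per month is the "up" seconds divided by the total seconds in that month (or by the sum of the "up" and "down" seconds if a site wasn't monitored for the full month).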