Use Power Automate to Email a Unique List of Users

I want to preface this by saying that this example will use a small list of users. If you are dealing with a “large” SharePoint list, Excel file, table, or API call with more than ~5,000 rows, this post is not for you. Yes, there are ways around that limit, but this is a super simple example of getting a unique list of users from a data source.

In this post, I’ll show you how to use a Power Automate Flow to get a unique list of users from a SharePoint list or an Excel file. In both examples, setting up the data should be simple. For the SharePoint list example, you will need access to a site where you can create or modify a list, or at a minimum, access to a list with a person column. For the Excel example, you will need access to OneDrive; if not, the same basic steps will work if you have access to a SharePoint site that has an Excel file in a library.

SharePoint list example:
First, let’s set up the SharePoint list. From a SharePoint site, click the gear icon in the top-right corner, then select Site contents. From the Site Contents page, click on + New and select List.

Depending on when you read this, the following screen might look different than my screenshot. From the window that opens, click on List. Name the list: MyTestList and click the Create button.

Do the following in the newly created list:
Click + Add column –> Person –> Next
In the Create a column window, name it Approver and click Save.


In the SharePoint list, add a few list items, and be sure to include some duplicate values in the Approver field.

Excel example:
Open an Excel file and create two columns: ID and Approver
In the file, create some duplicate email values in the Approver column. Next, we need to create a table from the newly entered data. If you skip this step, the flow will not work. In one of the rows you just created, select a cell, then from the ribbon click Insert, then Table. A little pop-up will appear to confirm what data the table should include. Ensure that “My table has headers” is checked, then click OK to save the change. Again, if you fail to create the table, the flow will not be able to handle the Excel file.



Now, save / upload the Excel file to your OneDrive. You can do this by opening OneDrive from your computer or browsing to it. If you are unsure how to access OneDrive, you can navigate to this URL and select OneDrive from the available options. https://m365.cloud.microsoft/apps?auth=2&home=1

SharePoint list flow example
Navigate to https://make.powerautomate.com/
On the left side of the screen, click My flows, + New flow, then Instant cloud flow.

Yes, you can select Automated cloud flow, but for testing, I want to be able to trigger the flow on demand.

When the Build an instant cloud flow window opens, enter a flow name, select Manually trigger a flow, and click Create. If you do not see this trigger action, click Skip, and you can search for it on the next page that opens.


NOTE: For this demo, I will be using the new flow designer UI. If you are using the classic designer, the same steps will still apply. No, I’m not a fan of the new UI, but I’m guessing people reading this might be forced to use it.

In the flow designer, click the + button below Manually trigger a flow action. In the window that opens to the left, search for SharePoint, and select get items.



In the Get Items action, select your SharePoint site and your list name.
Click the + below the Get Items action, in the window search for select, and select it in the Data Operations group.
With the Select action…selected, click the From field on the right side of the screen, then click the little lightning bolt icon to open the dynamic content menu. If you don’t see the icon, type a forward slash (/) in the From field, then select insert dynamic content. In the dynamic content window, there should be a single item listed below Get items; click on it.


While the Select action is still open, click the T to the right of the Map field to switch it to text mode. Do not skip this step!

Open the dynamic content window, search for Approver, and select Approver Email.
Yes, you will see an error, but you can ignore it.


At this point, the flow should look like this. Save and test it (top right).



Next, we are going to use a union function to get a unique list (collection) of email addresses from the Select action.

Below the Select action, click the +, search for Compose, and select it. With the Compose action open, open the expression window. In the same way you opened the dynamic content window, select fx or select insert expression.
In the expression window, enter this: union(body('Select'), body('Select'))
Click the Add button. Your Compose should now look like this:

This is a little hack that combines the Select action with itself, returning a unique set of email addresses.

Save and test the flow. If you click on the Compose action and look at the Outputs, you should see a unique list of emails.

But wait…there’s more! How do you use this in an email action? Add another Compose action below the current one. In the new Compose action, open the expression window and enter the following: join(outputs('Compose'), ';')
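If it helps to see what those two expressions are doing, here is the same dedupe-and-join logic expressed in PowerShell. This is illustrative only; the flow itself needs nothing beyond the expressions above.

$emails = 'a@contoso.com','b@contoso.com','a@contoso.com'
$unique = $emails | Select-Object -Unique   # what union(body('Select'), body('Select')) does
$unique -join ';'                           # what join(outputs('Compose'), ';') does
# Output: a@contoso.com;b@contoso.com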

Again, save and test. The output of the second Compose can now be used in a send email action. In the send an email action, click the small gear icon on the right side of the To field, select dynamic content, and select the second Compose. You can rename the flow actions to make them less generic.

Complete flow:

Excel flow example
In this section, I’m going to breeze through the steps much more quickly. If you are unsure about something, scroll up and review the SharePoint list flow example section.

Create a new flow that is manually triggered.
Add an action to get the rows from your Excel file. In my example, I’m using the list rows present in a table under the Excel Online (Business) section.


In the Excel action, I’ve selected the following values. Yes, yours will be different. If you are unable to choose a value for Table, it means you skipped the key step at the beginning of this post. Start over…


Below the Excel action, click the + and add a Select action.
From: body/value
Map: click the T –> dynamic content, in the Excel section, click See more, and select Approver

The populated Select should look like this:

If you did not click the T on the right of the Map field, this will not work.

Below the Select action, click the +, search for Compose, and select it. With the Compose action open, open the expression window. In the same way you opened the dynamic content window, select fx or select insert expression.
In the expression window, enter this: union(body('Select'), body('Select'))
Click the Add button. Your Compose should now look like this:

How do you use this in an email action? Add another Compose action below the current one. In the new Compose action, open the expression window and enter the following: join(outputs('Compose'), ';')

Again, save and test. The output of the second Compose can now be used in a send email action. In the send an email action, click the small gear icon on the right side of the To field, select dynamic content, and select the second Compose. You can rename the flow actions to make them less generic.


Once you get the data from the Excel file, the steps are the same as the first example in this post. In another post like this, I step through this Excel action in more detail: Power Automate: Sending Emails Using Excel Data

For the most part, the above examples apply to many data sources beyond SharePoint or an Excel file stored in OneDrive or SharePoint. The union and join functions are the core parts of the flow for getting a unique list of users or other duplicate values.

Power BI + SharePoint Integration Is Ending: Here’s Your PowerShell Script

With the typical heads-up, Microsoft announced that a somewhat underutilized feature in SharePoint Online is being phased out. The ability to create a Power BI report directly from a SharePoint list or library using the visualize this list feature will soon be a thing of the past. In the announcement blog post, Microsoft provides an HTTP GET call to dump all of the reports, but not much else. What I’m providing in my script is more insight into which sites and lists have the reports bound to them. What my script is lacking is the detail needed to know who created the report or owns it. If you know how to capture this, please let me know, and I’ll update the script.

For this script to run, you will need:
PnP PowerShell
An Azure App Registration
Ability to access your tenant using az login
– It’s recommended that you use an account with Fabric administrator rights, tenant admin rights, or another level of access that can see all things Power BI.
If you are unable to utilize an App Reg, I’ve provided a second script below this one that will output the same basic detail, minus the target list display name.

In your PowerShell terminal, log in using az login, then run the script.
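If you haven’t logged in yet, it’s just:

az login
# or, if you need to target a specific tenant:
az login --tenant contoso.onmicrosoft.com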

#region Configuration UPDATE THE CONFIG VALUES
# Note: this script uses the ?? (null-coalescing) operator, so run it in PowerShell 7+
$config = @{
    ApiEndpoint = "https://api.powerbi.com/v1.0/myorg/admin/groups?`$filter=endswith(name,'SPList')&`$expand=reports,datasets,users&`$top=5000"
    OutputFile = "PowerBI_SPList_Workspaces_WithDatasources_$(Get-Date -Format 'yyyyMMdd_HHmmss').csv"

    ClientId = "UPDATE THIS"
    Thumbprint = "UPDATE THIS"
    Tenant = "YourTenant" #contoso
}

$listDisplayNameCache = @{}
#endregion

#region Helper Functions
function Get-ListGuidFromEmail {
    param($emailAddress)
    if ($emailAddress -match "SPList-([a-fA-F0-9]{8}-[a-fA-F0-9]{4}-[a-fA-F0-9]{4}-[a-fA-F0-9]{4}-[a-fA-F0-9]{12})") {
        return $matches[1]
    }
    return $null
}

function Get-SharePointListDisplayName {
    param([string]$siteUrl, [string]$listGuid)
    
    try {
        Connect-PnPOnline -Url $siteUrl -ClientId $config.ClientId -Thumbprint $config.Thumbprint -Tenant "$($config.Tenant).onmicrosoft.com" -ErrorAction Stop
        $list = Get-PnPList -Identity $listGuid -ErrorAction Stop
        return $list.Title
    }
    catch {
        Write-Host "      ⚠️  Could not retrieve list name for GUID $listGuid : $($_.Exception.Message)" -ForegroundColor Yellow
        return $null
    }
}

function Get-ListInfoFromWorkspace {
    param($workspace, $siteUrl)
    
    $listGuid = $null
    $listDisplayName = $null
    
    foreach ($user in $workspace.users) {
        $extractedGuid = Get-ListGuidFromEmail -emailAddress $user.emailAddress
        if ($extractedGuid) {
            $listGuid = $extractedGuid
            Write-Host "      Found List GUID: $listGuid" -ForegroundColor Cyan
            
            $cacheKey = "$siteUrl|$listGuid"
            if ($listDisplayNameCache.ContainsKey($cacheKey)) {
                $listDisplayName = $listDisplayNameCache[$cacheKey]
                Write-Host "      Using cached: $listDisplayName" -ForegroundColor Gray
            }
            else {
                Write-Host "      Querying SharePoint..." -ForegroundColor Yellow
                $listDisplayName = Get-SharePointListDisplayName -siteUrl $siteUrl -listGuid $listGuid
                if ($listDisplayName) {
                    Write-Host "      ✅ Display Name: $listDisplayName" -ForegroundColor Green
                    $listDisplayNameCache[$cacheKey] = $listDisplayName
                }
            }
            break
        }
    }
    
    return @{
        Guid = $listGuid
        DisplayName = $listDisplayName
    }
}

function New-WorkspaceResult {
    param(
        $workspace,
        $report = $null,
        $datasetId = $null,
        $datasource = $null,
        $listInfo = $null,
        $errorStatus = $null
    )
    
    $usersList = ($workspace.users | ForEach-Object { "$($_.emailAddress) [$($_.groupUserAccessRight)]" }) -join "; "
    
    return [PSCustomObject]@{
        WorkspaceName = $workspace.name
        WorkspaceId = $workspace.id
        WorkspaceType = $workspace.type
        WorkspaceState = $workspace.state
        ReportName = $report.name ?? "No Reports"
        ReportId = $report.id
        ReportWebUrl = $report.webUrl
        DatasetId = $datasetId
        DatasourceType = $datasource.datasourceType ?? $errorStatus
        DatasourceId = $datasource.datasourceId
        SharePointSiteUrl = $datasource.connectionDetails.url
        SharePointListGuid = $listInfo.Guid
        SharePointListDisplayName = $listInfo.DisplayName
        ConnectionString = $datasource.connectionDetails.url
        GatewayId = $datasource.gatewayId
        UserCount = $workspace.users.Count
        Users = $usersList
        QueryDate = Get-Date
    }
}
#endregion

#region Authentication
Write-Host "=== POWER BI WORKSPACE & DATASOURCE QUERY ===" -ForegroundColor Yellow
Write-Host "API Endpoint: $($config.ApiEndpoint)" -ForegroundColor Cyan
Write-Host "Output File: $($config.OutputFile)" -ForegroundColor Cyan

try {
    Write-Host "`nGetting access token..." -ForegroundColor Yellow
    $token = az account get-access-token --resource https://analysis.windows.net/powerbi/api --query accessToken -o tsv
    if (!$token) { throw "Failed to get access token" }
    Write-Host "✅ Token retrieved" -ForegroundColor Green
}
catch {
    Write-Host "❌ Failed to authenticate: $_" -ForegroundColor Red
    Write-Host "Please run 'az login' first" -ForegroundColor Yellow
    exit 1
}
#endregion

#region Query Power BI Workspaces
try {
    Write-Host "`nQuerying Power BI Admin API..." -ForegroundColor Yellow
    $headers = @{ "Authorization" = "Bearer $token"; "Content-Type" = "application/json" }
    $response = Invoke-RestMethod -Uri $config.ApiEndpoint -Headers $headers -Method Get
    $workspaces = $response.value
    
    Write-Host "✅ Found $($workspaces.Count) workspace(s)" -ForegroundColor Green
    if ($workspaces.Count -eq 0) { exit 0 }
}
catch {
    Write-Host "❌ API call failed: $_" -ForegroundColor Red
    exit 1
}
#endregion

#region Process Workspaces
Write-Host "`n=== PROCESSING WORKSPACES & DATASOURCES ===" -ForegroundColor Yellow
$allResults = @()
$counter = 1

foreach ($workspace in $workspaces) {
    Write-Host "`n--- Workspace $counter of $($workspaces.Count) ---" -ForegroundColor Cyan
    Write-Host "Name: $($workspace.name)" -ForegroundColor White
    
    $datasetIds = $workspace.reports | Where-Object { $_.datasetId } | Select-Object -ExpandProperty datasetId -Unique
    
    if (-not $datasetIds) {
        Write-Host "No datasets found" -ForegroundColor Yellow
        $allResults += New-WorkspaceResult -workspace $workspace
        $counter++
        continue
    }
    
    Write-Host "Datasets: $($datasetIds.Count)" -ForegroundColor Green
    
    foreach ($datasetId in $datasetIds) {
        Write-Host "`n  Dataset: $datasetId" -ForegroundColor Yellow
        
        try {
            $datasourceUrl = "https://api.powerbi.com/v1.0/myorg/admin/datasets/$datasetId/datasources"
            $datasourcesResponse = Invoke-RestMethod -Uri $datasourceUrl -Headers $headers -Method Get
            $datasources = $datasourcesResponse.value
            
            if (-not $datasources -or $datasources.Count -eq 0) {
                Write-Host "  ⚠️  No datasources found" -ForegroundColor Yellow
                continue
            }
            
            Write-Host "  ✅ Found $($datasources.Count) datasource(s)" -ForegroundColor Green
            
            foreach ($datasource in $datasources) {
                Write-Host "    Type: $($datasource.datasourceType)" -ForegroundColor Gray
                
                $siteUrl = $datasource.connectionDetails.url
                $listInfo = Get-ListInfoFromWorkspace -workspace $workspace -siteUrl $siteUrl
                $reportsUsingDataset = $workspace.reports | Where-Object { $_.datasetId -eq $datasetId }
                
                foreach ($report in $reportsUsingDataset) {
                    $allResults += New-WorkspaceResult -workspace $workspace -report $report -datasetId $datasetId -datasource $datasource -listInfo $listInfo
                }
            }
        }
        catch {
            Write-Host "  ⚠️  Error: $($_.Exception.Message)" -ForegroundColor Yellow
            $reportsUsingDataset = $workspace.reports | Where-Object { $_.datasetId -eq $datasetId }
            foreach ($report in $reportsUsingDataset) {
                $allResults += New-WorkspaceResult -workspace $workspace -report $report -datasetId $datasetId -errorStatus "Error retrieving"
            }
        }
        
        Start-Sleep -Milliseconds 100
    }
    
    $counter++
}
#endregion

#region Export and Summary
if ($allResults.Count -gt 0) {
    Write-Host "`n=== EXPORTING RESULTS ===" -ForegroundColor Yellow
    $allResults | Export-Csv -Path $config.OutputFile -NoTypeInformation
    Write-Host "✅ Exported to: $($config.OutputFile)" -ForegroundColor Green
    Write-Host "   Total records: $($allResults.Count)" -ForegroundColor Cyan
}

Write-Host "`n=== SUMMARY ===" -ForegroundColor Yellow
Write-Host "Total Workspaces: $($workspaces.Count)" -ForegroundColor Cyan
Write-Host "Total Records: $($allResults.Count)" -ForegroundColor Cyan

$uniqueSites = $allResults | Where-Object { $_.SharePointSiteUrl } | Select-Object -ExpandProperty SharePointSiteUrl -Unique
if ($uniqueSites) {
    Write-Host "`nUnique SharePoint Sites: $($uniqueSites.Count)" -ForegroundColor Cyan
    $uniqueSites | ForEach-Object { Write-Host "  - $_" -ForegroundColor Gray }
}

Write-Host "`n✅ Query completed successfully!" -ForegroundColor Green
#endregion


---------- Other script ----------

Here’s the same basic script, but without needing the Azure App Reg. You will need to grab the SharePointListGuid value, navigate to the corresponding site, open the site contents page, and then click on any list or library settings. In the URL, swap the existing value with the one from SharePointListGuid. This will display the name of the list to which the current report is bound.

Example:
https://taco.sharepoint.com/sites/Mas-Taco/_layouts/15/listedit.aspx?List=5ae2c019-75a5-495c-a5a9-beaf47b71346
You’ll want to swap the GUID after ?List=
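If you’d rather not swap GUIDs by hand, a small follow-up snippet can build the listedit.aspx URLs from the export. This is illustrative; point it at whatever CSV the script below produces.

# Build a browsable listedit.aspx URL for each exported row
$results = Import-Csv ".\PowerBI_SPList_Workspaces_20250101_000000.csv" # use your output file name
$results | Where-Object { $_.SharePointSiteUrl -and $_.SharePointListGuid } | ForEach-Object {
    "$($_.SharePointSiteUrl)/_layouts/15/listedit.aspx?List=$($_.SharePointListGuid)"
} | Sort-Object -Unique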

#region Configuration
# Note: this script uses the ?? (null-coalescing) operator, so run it in PowerShell 7+
$config = @{
    ApiEndpoint = "https://api.powerbi.com/v1.0/myorg/admin/groups?`$filter=endswith(name,'SPList')&`$expand=reports,datasets,users&`$top=5000"
    OutputFile = "PowerBI_SPList_Workspaces_$(Get-Date -Format 'yyyyMMdd_HHmmss').csv"
}
#endregion

#region Helper Functions
function Get-ListGuidFromEmail {
    param($emailAddress)
    if ($emailAddress -match "SPList-([a-fA-F0-9]{8}-[a-fA-F0-9]{4}-[a-fA-F0-9]{4}-[a-fA-F0-9]{4}-[a-fA-F0-9]{12})") {
        return $matches[1]
    }
    return $null
}

function New-WorkspaceResult {
    param(
        $workspace,
        $report = $null,
        $datasetId = $null,
        $datasource = $null,
        $listGuid = $null
    )
    
    $usersList = ($workspace.users | ForEach-Object { "$($_.emailAddress) [$($_.groupUserAccessRight)]" }) -join "; "
    
    return [PSCustomObject]@{
        WorkspaceName = $workspace.name
        WorkspaceId = $workspace.id
        WorkspaceType = $workspace.type
        WorkspaceState = $workspace.state
        ReportName = $report.name ?? "No Reports"
        ReportId = $report.id
        ReportWebUrl = $report.webUrl
        DatasetId = $datasetId
        DatasourceType = $datasource.datasourceType
        DatasourceId = $datasource.datasourceId
        SharePointSiteUrl = $datasource.connectionDetails.url
        SharePointListGuid = $listGuid
        ConnectionString = $datasource.connectionDetails.url
        GatewayId = $datasource.gatewayId
        UserCount = $workspace.users.Count
        Users = $usersList
        QueryDate = Get-Date
    }
}
#endregion

#region Authentication
Write-Host "=== POWER BI WORKSPACE QUERY (Simple Version) ===" -ForegroundColor Yellow
Write-Host "API Endpoint: $($config.ApiEndpoint)" -ForegroundColor Cyan
Write-Host "Output File: $($config.OutputFile)" -ForegroundColor Cyan

try {
    Write-Host "`nGetting access token..." -ForegroundColor Yellow
    $token = az account get-access-token --resource https://analysis.windows.net/powerbi/api --query accessToken -o tsv
    if (!$token) { throw "Failed to get access token" }
    Write-Host "✅ Token retrieved" -ForegroundColor Green
}
catch {
    Write-Host "❌ Failed to authenticate: $_" -ForegroundColor Red
    Write-Host "Please run 'az login' first" -ForegroundColor Yellow
    exit 1
}
#endregion

#region Query Power BI Workspaces
try {
    Write-Host "`nQuerying Power BI Admin API..." -ForegroundColor Yellow
    $headers = @{ "Authorization" = "Bearer $token"; "Content-Type" = "application/json" }
    $response = Invoke-RestMethod -Uri $config.ApiEndpoint -Headers $headers -Method Get
    $workspaces = $response.value
    
    Write-Host "✅ Found $($workspaces.Count) workspace(s)" -ForegroundColor Green
    if ($workspaces.Count -eq 0) { exit 0 }
}
catch {
    Write-Host "❌ API call failed: $_" -ForegroundColor Red
    exit 1
}
#endregion

#region Process Workspaces
Write-Host "`n=== PROCESSING WORKSPACES ===" -ForegroundColor Yellow
$allResults = @()
$counter = 1

foreach ($workspace in $workspaces) {
    Write-Host "`n--- Workspace $counter of $($workspaces.Count) ---" -ForegroundColor Cyan
    Write-Host "Name: $($workspace.name)" -ForegroundColor White
    
    # Extract list GUID from workspace users (no SharePoint connection needed)
    $listGuid = $null
    foreach ($user in $workspace.users) {
        $extractedGuid = Get-ListGuidFromEmail -emailAddress $user.emailAddress
        if ($extractedGuid) {
            $listGuid = $extractedGuid
            Write-Host "List GUID: $listGuid" -ForegroundColor Cyan
            break
        }
    }
    
    $datasetIds = $workspace.reports | Where-Object { $_.datasetId } | Select-Object -ExpandProperty datasetId -Unique
    
    if (-not $datasetIds) {
        Write-Host "No datasets found" -ForegroundColor Yellow
        $allResults += New-WorkspaceResult -workspace $workspace -listGuid $listGuid
        $counter++
        continue
    }
    
    Write-Host "Datasets: $($datasetIds.Count)" -ForegroundColor Green
    
    foreach ($datasetId in $datasetIds) {
        Write-Host "  Dataset: $datasetId" -ForegroundColor Yellow
        
        try {
            $datasourceUrl = "https://api.powerbi.com/v1.0/myorg/admin/datasets/$datasetId/datasources"
            $datasourcesResponse = Invoke-RestMethod -Uri $datasourceUrl -Headers $headers -Method Get
            $datasources = $datasourcesResponse.value
            
            if (-not $datasources -or $datasources.Count -eq 0) {
                Write-Host "  No datasources found" -ForegroundColor Yellow
                continue
            }
            
            Write-Host "  ✅ Found $($datasources.Count) datasource(s)" -ForegroundColor Green
            
            foreach ($datasource in $datasources) {
                $siteUrl = $datasource.connectionDetails.url
                Write-Host "    Site: $siteUrl" -ForegroundColor Gray
                
                $reportsUsingDataset = $workspace.reports | Where-Object { $_.datasetId -eq $datasetId }
                
                foreach ($report in $reportsUsingDataset) {
                    $allResults += New-WorkspaceResult -workspace $workspace -report $report -datasetId $datasetId -datasource $datasource -listGuid $listGuid
                }
            }
        }
        catch {
            Write-Host "  ⚠️  Error: $($_.Exception.Message)" -ForegroundColor Yellow
            $reportsUsingDataset = $workspace.reports | Where-Object { $_.datasetId -eq $datasetId }
            foreach ($report in $reportsUsingDataset) {
                $allResults += New-WorkspaceResult -workspace $workspace -report $report -datasetId $datasetId -listGuid $listGuid
            }
        }
        
        Start-Sleep -Milliseconds 100
    }
    
    $counter++
}
#endregion

#region Export and Summary
if ($allResults.Count -gt 0) {
    Write-Host "`n=== EXPORTING RESULTS ===" -ForegroundColor Yellow
    $allResults | Export-Csv -Path $config.OutputFile -NoTypeInformation
    Write-Host "✅ Exported to: $($config.OutputFile)" -ForegroundColor Green
    Write-Host "   Location: $(Get-Location)\$($config.OutputFile)" -ForegroundColor Green
    Write-Host "   Total records: $($allResults.Count)" -ForegroundColor Cyan
}

Write-Host "`n=== SUMMARY ===" -ForegroundColor Yellow
Write-Host "Total Workspaces: $($workspaces.Count)" -ForegroundColor Cyan
Write-Host "Total Records: $($allResults.Count)" -ForegroundColor Cyan

# Show unique SharePoint sites found
$uniqueSites = $allResults | Where-Object { $_.SharePointSiteUrl } | Select-Object -ExpandProperty SharePointSiteUrl -Unique
if ($uniqueSites) {
    Write-Host "`nUnique SharePoint Sites: $($uniqueSites.Count)" -ForegroundColor Cyan
    $uniqueSites | ForEach-Object { Write-Host "  - $_" -ForegroundColor Gray }
}

# Show unique list GUIDs found
$uniqueListGuids = $allResults | Where-Object { $_.SharePointListGuid } | Select-Object -ExpandProperty SharePointListGuid -Unique
if ($uniqueListGuids) {
    Write-Host "`nUnique SharePoint List GUIDs: $($uniqueListGuids.Count)" -ForegroundColor Cyan
    $uniqueListGuids | ForEach-Object { Write-Host "  - $_" -ForegroundColor Gray }
}

Write-Host "`n✅ Query completed successfully!" -ForegroundColor Green
#endregion

From a target SharePoint list or library, locate the Integrate tab, click it, click Power BI, and from there, you should see your report name.

Get SharePoint Site ID Using Microsoft Graph

Using Microsoft Graph, how do you get the SharePoint site ID using the site path?

Endpoint being used: https://learn.microsoft.com/en-us/graph/api/site-getbypath?view=graph-rest-1.0

# App registration values used to request the token (update these)
$tenantId     = "UPDATE THIS"   # tenant GUID or contoso.onmicrosoft.com
$clientId     = "UPDATE THIS"
$clientSecret = "UPDATE THIS"

# Function to acquire an access token and return the token and its expiration time
function Get-GraphAccessToken {
    $tokenUrl = "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token"
    $body = @{
        client_id     = $clientId
        scope         = "https://graph.microsoft.com/.default"
        client_secret = $clientSecret
        grant_type    = "client_credentials"
    }

    $tokenResponse = Invoke-RestMethod -Uri $tokenUrl -Method Post -Body $body -ContentType "application/x-www-form-urlencoded"
    $accessToken = $tokenResponse.access_token
    $expiresIn = $tokenResponse.expires_in
    $tokenExpiration = (Get-Date).AddSeconds($expiresIn - 300) # Refresh 5 minutes before expiration

    return $accessToken, $tokenExpiration
}
# Acquire the token
$accessToken, $tokenExpiration = Get-GraphAccessToken
$headers = @{
    Authorization  = "Bearer $accessToken"
    "Content-Type" = "application/json"
}

# Target SharePoint site
$siteUrl = "https://taco.sharepoint.com/sites/test-site" 

# Extract the hostname and site path
$uri = [System.Uri]::new($siteUrl)
$hostname = $uri.Host
$sitePath = $uri.AbsolutePath.TrimStart('/')

# Define the endpoint URL to get the SharePoint site ID
$graphSiteUrl = "https://graph.microsoft.com/v1.0/sites/${hostname}:/${sitePath}"

# Make the request to get the site ID
$siteResponse = Invoke-RestMethod -Uri $graphSiteUrl -Headers $headers -Method Get

# Extract the relevant parts of the composite ID (hostname,siteCollectionId,siteId)
$siteIdParts = $siteResponse.id.Split(',')
$siteId = "$($siteIdParts[1]),$($siteIdParts[2])"

# Output the site ID
Write-Output "Hostname: $($siteResponse.siteCollection.hostname)"
Write-Output "Site Collection ID: $($siteIdParts[1])"
Write-Output "Site ID: $($siteIdParts[2])"
Write-Output "Combined ID--------> $siteId"
Write-Output "Site Display Name: $($siteResponse.displayName)"
Write-Output "Site Web URL: $($siteResponse.webUrl)"

This will return a value like 2480e89d-303a-4f38-b4fe-27f824ff88ac,d605ce5c-f356-422a-84fe-1d7820bc9e6d, which represents the site collection ID and the site ID.

Site collection ID: 2480e89d-303a-4f38-b4fe-27f824ff88ac
Site ID: d605ce5c-f356-422a-84fe-1d7820bc9e6d
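As a quick sanity check, the full three-part ID can be passed straight back to Graph as the site-id. A minimal example, reusing the $headers from the script above:

# Optional: confirm the ID resolves by calling Graph with the full composite ID
$fullSiteId = $siteResponse.id
$siteCheck = Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/sites/$fullSiteId" -Headers $headers -Method Get
Write-Output "Resolved site: $($siteCheck.displayName) ($($siteCheck.webUrl))"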

PowerShell Currency Conversion Using FRED

I’m working on a project that requires currency conversions between the US dollar and the Euro. In the most basic case, the project has two requirements: backfill historical Euro amounts and get current Euro amounts using a transaction date and USD value. Plugging into an API to get the current currency exchange rate is simple enough, but finding an open or free dataset with 10+ years of currency transactions was another thing. Google and GPT/LLMs are littered with what appear to be free sites, but they are limited to ~100 API calls, or their datasets are not deep enough for my use case.

I landed on a great tool provided by the Federal Reserve named FRED, short for Federal Reserve Economic Data. FRED is a free site with APIs sitting on a treasure trove of data. When I started this project, I simply went to the FRED site and downloaded the dataset I needed (link), but I wanted to ensure that my process was current and could handle new transactions for years to come. Using the FRED API requires signing up for a free account, and you will want a FRED API key to follow along with this demo.

What I’m going to demo in this post: creating a FRED account, using PowerShell to read from an Excel file, querying an API, writing back to the Excel file

FRED account:
Visit the https://fred.stlouisfed.org/ site, click My Account (top right), and click Create New Account when the modal opens. After you’ve created an account, navigate to the My Account page and click API Keys in the left nav. On the API Keys page, click the Request API Key button, input some text in the description box, click the agreement checkbox, and then click Request API Key. Link to the API Key page: https://fredaccount.stlouisfed.org/apikeys

For this demo, I’ve created a simple Excel file with the following columns and datatypes: TransDate (date), USD (currency), ConversionDate (date), EUR (currency)

To interact with an Excel file from PowerShell, I went with the ImportExcel module. In VS Code or your IDE of choice, run this command: Install-Module ImportExcel -Scope CurrentUser

To get the ball rolling and ensure the ImportExcel module works, I will test reading from the Excel file, loop through the rows, and output their values.
$excelPath: location of the Excel file
$worksheetName: name of the worksheet/tab where the data is stored (optional)
$excelData: imported Excel data

$excelPath = "C:\code\CurrencyDemo.xlsx"
$worksheetName = "Historical"
$excelData = Import-Excel -Path $excelPath -WorksheetName $worksheetName

foreach($row in $excelData){
    Write-Output "Transction date: $($row.TransDate) Amount: $($row.USD) USD"
}

Next, I will test my connection to the FRED API, returning a sample transaction. There are two important things to note in the next script. The $series variable is bound to the USD to Euro Spot exchange rate value; if you need to work with a different currency, visit the Daily Rates page and filter by the Geographies, or use the site search if you cannot find what you are looking for there. If you type Peso in the site search, it will suggest the Mexican Peso to U.S. Dollar. Clicking on the search result will open the page for that conversion, and the page will reveal the $series value needed for the conversion. The Peso to USD series is DEXMXUS (look at the URL or the value to the right of the conversion overview). The next important variable to note is $date; this is obvious for this example, but you can query the API for larger date ranges if needed and work with the larger API response.

# Your FRED API Key
$apiKey = "75fa2e6ce85_taco_805016ea4d764c5"

# Set the parameters
$series = "DEXUSEU"  # This is the series ID for USD to Euro exchange rate
$date = "2024-01-16"

# Construct the API URL
$url = "https://api.stlouisfed.org/fred/series/observations?series_id=$series&observation_start=$date&observation_end=$date&api_key=$apiKey&file_type=json"

# Make the API request
$response = Invoke-RestMethod -Uri $url -Method Get

# Check if we got a result
if ($response.observations.Count -gt 0) {
    $usd_to_eur_rate = [double]$response.observations[0].value
    $eur_to_usd_rate = [math]::Round(1 / $usd_to_eur_rate, 4)
    Write-Output "The USD to Euro conversion rate on $date was: $usd_to_eur_rate"
    Write-Output "The Euro to USD conversion rate on $date was: $eur_to_usd_rate"
} else {
    Write-Output "No data available for the specified date."
}

In the last script for this demo, I will combine all the parts and provide an example for dealing with input dates that are Saturday or Sunday. From what I’ve learned on this journey, currencies are not typically traded seven days a week, so if an input date falls on a weekend, there needs to be an offset to the preceding Friday. This script must be extended in a production scenario to deal with major holidays.

function CurrencyConversion {
    param (
        $convDate,
        $usdAmount
    )

    # Normalize the input (string or datetime) into a datetime object
    $parsedDate = [datetime]$convDate
    $apiDateValue = $parsedDate.ToString("yyyy-MM-dd")

    # Your FRED API Key
    $apiKey = "75fa2e6ce85_taco_805016ea4d764c5"
    $seriesId = "EXUSEU"

    # Construct the API URL
    $apiUrl = "https://api.stlouisfed.org/fred/series/observations?series_id=$seriesId&api_key=$apiKey&file_type=json&observation_start=$apiDateValue&observation_end=$apiDateValue"

    # Make the API call
    $response = Invoke-RestMethod -Uri $apiUrl

    # Check if there are any observations for the given date
    if ($response.observations.count -gt 0) {
        # Assuming the first observation is the one we're interested in
        $usd_to_eur_rate = [double]$response.observations[0].value
        $eur_to_usd_rate = [math]::Round(1 / $usd_to_eur_rate, 4)
    }
    else {
        Write-Host "No data found for ................................................ $parsedDate"
        return $null
    }

    $convertedValue = $usdAmount * $eur_to_usd_rate
    return $convertedValue
}

function DateConversion {
    param (
        $conversionDate
    )

    # Check if 'conversionDate' is not null or empty
    if (-not [string]::IsNullOrWhiteSpace($conversionDate)) {
        # Parse the input date into a datetime object
        $targetDate = [datetime]::Parse($conversionDate)

        # Check if the day is Saturday or Sunday
        if ($targetDate.DayOfWeek -eq [DayOfWeek]::Saturday) {
            $conversionDate = $targetDate.AddDays(-1).ToString("yyyy-MM-dd")
        }
        elseif ($targetDate.DayOfWeek -eq [DayOfWeek]::Sunday) {
            $conversionDate = $targetDate.AddDays(-2).ToString("yyyy-MM-dd")
        }
    }

    return $conversionDate
}

$excelPath = "C:\code\CurrencyDemo.xlsx"
$worksheetName = "Historical"
$excelData = Import-Excel -Path $excelPath -WorksheetName $worksheetName

foreach ($row in $excelData) {

    $transDate = $row.TransDate
    $amountUSD = $row.USD
    $submittedDate = $null

    # Get the date for the currency conversion
    if (-not [string]::IsNullOrWhiteSpace($transDate)) {
        $submittedDate = DateConversion -conversionDate $transDate
    }

    # Check if both Submitted Date and USD are not null or empty
    if (-not [string]::IsNullOrWhiteSpace($submittedDate) -and 
        -not [string]::IsNullOrWhiteSpace($amountUSD)) {
        $convertedValue = CurrencyConversion -convDate $submittedDate -usdAmount $amountUSD
    }

    Write-Output "Converted value for $($amountUSD) USD on $($submittedDate.ToShortDateString()): $convertedValue"

    #update the excel row with the output
    $row.EUR = $convertedValue
    $row.ConversionDate = $submittedDate
}

# Export the updated data to Excel
$excelData | Export-Excel -Path $excelPath -WorksheetName $worksheetName 

To streamline the script, I created two helper functions. One handles the weekend-to-Friday conversion, and the other makes the API call to FRED. The script will loop over all of the rows in the spreadsheet, handle the currency conversion, and then bulk-write the output to the target Excel file. The highlighted values show where a weekend date was passed and the script handled the offset to the preceding Friday.

Yes, some places in the script need improvement, but I wanted to provide a simple example for handling currency conversion with PowerShell and historical dates. As always, please don’t hesitate to reach out or leave a comment if any part of this doesn’t make sense or if there’s a more-better way of doing things.

Note:
Be mindful of the number of calls you make to the API in a given timeframe. I was testing this process, hammered on the API with ~1,000 calls, and hit an API limit error. Adding a simple pause to the script fixed the problem, e.g., after X calls, pause for Y seconds.
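Here is a minimal sketch of that pause, assuming a 60-second wait after every 100 calls; tune both numbers to whatever limit you hit.

$callCount = 0
foreach ($row in $excelData) {
    # ... existing date and currency conversion logic ...
    $callCount++
    if ($callCount % 100 -eq 0) {
        Write-Output "Made $callCount API calls; sleeping 60 seconds..."
        Start-Sleep -Seconds 60
    }
}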

Power Automate: Sending Emails Using Excel Data

Recently, a user asked me how they could send emails using a flow, with Excel as the data source. I’m going to provide an in-depth guide that covers every step needed to accomplish this.

What’s needed to follow along:
Access to a Power Platform environment
URL: https://make.powerautomate.com/
OneDrive (but you can use SharePoint)
Excel file

Open a new Excel file and populate it with the following columns:
RowID, Employee Name, Manager Email, Email Sent
Enter data in each of the cells, but ensure that RowID has a unique value for each row.

After the data is entered, you will want to create a table encompassing the cells that were just populated. With one of the cells selected, click the Insert tab at the top of the file, then click Table.

When the Create Table popup opens, ensure it includes the rows and columns you created, and the My table has headers box should be checked. Click OK to close the popup.

From the File tab, click Save As and save the file to a OneDrive location. You can save the file to your desktop or another location, then copy it to a folder in OneDrive. It makes no difference how the file gets there, but for this flow to work, it needs to be in OneDrive or a SharePoint location you have access to.

From a browser, navigate to https://make.powerautomate.com/. On the left side of the screen, click on My flows. After the page refreshes, click + New flow and select Instant cloud flow.

In the Flow name field, input a name for your flow. From the Choose how to trigger this flow, select Manually trigger a flow, then click the Create button.

In the flow design canvas, click the + below the Manually trigger a flow action, and select Add an action.

When the Add an action window opens on the left side of the screen, you will notice that you have a bunch of actions to choose from. The first box can be used to search for actions; here, enter Excel list rows. Note how the actions are grouped by connector type, in our case, Excel Online. The other key thing to note is the see more blue text to the right of the Excel Online group. If you don’t see the action you are looking for, always remember to click the see more link. Doing this will display all available actions for the group. Go ahead and click on the List rows present in a table action.

From the Location dropdown, select OneDrive if that’s where you saved your file; else, select SharePoint or wherever you saved the file. For this example, the file must be in a storage location to which the flow can connect. Next, click the dropdown for the Document Library. In my example, you’ll notice that I have several options to choose from; if you get the same result, you’ll need to select each option and then click the File dropdown to see if you are in the correct location. Yes, it’s annoying.

Once you have the correct Document Library selected, click the dropdown for the File option, navigate to where your Excel file is stored, and select it.

From the Table dropdown, select the available option. In my example, the only table from the Excel file is Table2. There is a good chance yours is named something else.

You can verify the table name by returning to your Excel file, clicking the Table Design tab, and then noting the Table Name value.

Returning to the flow design canvas, click the + Add an action below the Excel action.

From the search box, enter send email. With the list of actions narrowed down, select the Send an email (V2) option. If you are reading this in the future, the option might be (V_something else), but make sure you are in the Office 365 Outlook group of actions. Do NOT use the action from the Mail group.

In the Send an email action, click in the To field. If the dynamic content is not visible, click the Enter custom value text, then the little lightning bolt icon.

When the dynamic content window opens, you will see a few options for fields you can select to populate the To value of the Send an email action. From the list of options, select Manager Email. This will pull in the manager’s email from the spreadsheet.

Click in the Subject field and enter some text; here, you will notice I input Example. After you’ve entered some text, click the lightning bolt icon again, but this time select Employee Name.

The last thing we will populate in this action is the Body of the email. Again, feel free to input some text here, and like the other fields, you can use values from the dynamic content menu to use values from the Excel file.

The completed email action will look like this.

You will notice that a For each was automatically added to the flow design canvas. Why? If you think about what the flow is doing from a process standpoint, it added the For each to loop over each row in the Excel file. For each row in the spreadsheet, do ____. In our example, it will send an email, and each email will reference the current item in the loop.

Below the Send an email action, click + Add an action.

From the add an action search box, enter excel update row, and select the Update a row action. We will use this action to update the spreadsheet for each email sent.

In the Update a row action, navigate to where your Excel file is stored. The steps you followed when connecting to the file a few steps back in this process will apply here. Once you’ve connected to the file, click in the Key Column and select RowID from the available choices.

Click into the Key Value field, open the dynamic content window, and select RowID. We are telling the action that we want to update the Excel file row corresponding to the current item in the For each loop.

Click the dropdown for the Advanced parameters field and select Email Sent. I entered Yes in the Email Sent field.

The completed flow should look like this: We trigger the flow, get the table from the Excel file, loop over each row in the table (for each), send an email, and update the spreadsheet for each item in the loop.

At the top of the screen, click Save, then Test

Select Manually when the next window opens, click Continue, and last but not least, click Run flow.

When the flow is finished running, you should see green check marks next to each flow action.

The emails were sent to each manager with their employee in the subject and body of each email. To test sending emails, I like to use https://temp-mail.org/en/

Navigating back to the Excel file, the Email Sent value is Yes for each row in the file.

That’s it! There are lots of steps, but I hope it covers everything you need to create a workflow that does the exact same thing.

Use Python to upload a LARGE file to SharePoint

In this post, I will quickly show how to use the Office365-REST-Python-Client library to upload a large file to a SharePoint library.

For this to work, you will need a certificate, an Azure App registration, and access to the target SharePoint site. I outlined all the necessary parts in this post: Modernizing Authentication in SharePoint Online. Note: the linked post will output a .PFX cert, and the script below will need a .PEM cert. You can use this Python script to convert the cert:

from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.serialization import pkcs12

# Load the PFX file
pfx_file = open('C:\\path_to_cert\\EXAMPLE.pfx', 'rb').read()  # replace with your pfx file path
(private_key, certificate, additional_certificates) = pkcs12.load_key_and_certificates(pfx_file, None, default_backend())

with open('NewCERT.pem', 'wb') as f:
    f.write(private_key.private_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PrivateFormat.TraditionalOpenSSL,
        encryption_algorithm=serialization.NoEncryption()
    ))
    f.write(certificate.public_bytes(serialization.Encoding.PEM))

# install this library if needed
# pip install cryptography

Ok, with that out of the way, you can use this script to upload to a SharePoint library. In the script, I’ve commented out the line that would be used to upload to a folder within a library.

import os
from office365.sharepoint.client_context import ClientContext

cert_credentials = {
    "tenant": "abcxyz-1234-4567-8910-0e3d638792fb",
    "client_id": "abcddd-4444-4444-cccc-123456789111",
    "thumbprint": "7D8D8DF7D8D2F4DF8DF45D4FD8FD48DF5D8D",
    "cert_path": "RestClient\\NewCERT.pem"
}
ctx = ClientContext("https://tacoranch.sharepoint.com/sites/python").with_client_certificate(**cert_credentials)
current_web = ctx.web.get().execute_query()
print("{0}".format(current_web.url))

filename = "LargeExcel.xlsx"
folder_path = "C:\\code\py"

def print_upload_progress(offset):
    # type: (int) -> None
    file_size = os.path.getsize(local_path)
    print(
        "Uploaded '{0}' bytes from '{1}'...[{2}%]".format(
            offset, file_size, round(offset / file_size * 100, 2)
        )
    )

#upload to a folder
#target_url = "Shared Documents/folderA/folderB"

target_url = "Shared Documents"
target_folder = ctx.web.get_folder_by_server_relative_url(target_url)
size_chunk = 1000000
local_path = os.path.join(folder_path, filename)
with open(local_path, "rb") as f:
    uploaded_file = target_folder.files.create_upload_session(
        f, size_chunk, print_upload_progress, filename
    ).execute_query()

print("File {0} has been uploaded successfully".format(uploaded_file.serverRelativeUrl))

If you receive an error stating you don’t have access, double-check that you’ve added the App Registration to the target SharePoint site permissions. Again, this is noted in the blog post linked at the beginning of this post.

Consider this a workaround until MS Graph is out of its latest beta and there’s more support for easily uploading to SharePoint.

What if you need to upload a file and set a column value? When working with SharePoint via the API, you must be mindful of the column names. The column name in the UI might not be the same as the internal name, so I will use the script above as my starting point and add the following script to the end. In this example, I’m setting two fields: ReportName and ReportDate.

import datetime  # needed for the ReportDate value below

#get the file that was just uploaded
file_item = uploaded_file.listItemAllFields

# Define a dictionary of field names and their new values
fields_to_update = {
    "ReportName": "My TPS Report",
    "ReportDate": datetime.datetime.now().isoformat(),
    # Add more fields here as needed
}

# Iterate over the dictionary and update each field
for field_name, new_value in fields_to_update.items():
    file_item.set_property(field_name, new_value)

# Commit the changes
file_item.update()
ctx.execute_query()

print("Report fields were updated")

How do you get a list of all the columns in a list or library? The script below will output all the column’s internal and display names.

from office365.sharepoint.client_context import ClientContext

cert_credentials = {
    "tenant": "abcxyz-1234-4567-8910-0e3d638792fb",
    "client_id": "abcddd-4444-4444-cccc-123456789111",
    "thumbprint": "7D8D8DF7D8D2F4DF8DF45D4FD8FD48DF5D8D",
    "cert_path": "RestClient\\NewCERT.pem"
}

ctx = ClientContext("https://tacoranch.sharepoint.com/sites/python").with_client_certificate(**cert_credentials)
current_web = ctx.web.get().execute_query()
print("{0}".format(current_web.url))

# Get the target list or library
list_or_library = ctx.web.lists.get_by_title('TPS-Reports')

# Load the fields
fields = list_or_library.fields.get().execute_query()

# Print the field names
for field in fields:
    print("Field internal name: {0}, Field display name: {1}".format(field.internal_name, field.title))



SharePoint Audit Using Purview

Today, I had a customer ask the following question:
“How can I pull a report showing all of the files in a library that have not been viewed?”
Typically, users request a report showing all the items or files accessed, so this request was unique.

You can run an audit search on almost everything in a cloud tenant using Microsoft Purview: every file downloaded or viewed, list items opened, edited, or deleted, page views, OneDrive actions, Teams actions, and the list goes on and on.

In Purview, click on the Audit link in the left nav, and it will open the audit search page.
Select the time range you want to target
Activities: Accessed file
Record types: SharePointFileOperation
Search name: this can be anything you want, i.e. SP Library Search
File, folder, or site: https://taco.sharepoint.com/sites/test/TheLibrary/*
Workload: SharePoint
The key items to note are the record type and file options. You can use a wildcard * to return results for everything in the target library. This will return a lot of data, so you will need to filter it after the report is downloaded. Once you’ve populated the fields, click Search, then wait a bit for it to complete. The amount of data in your tenant and current workloads will determine how long the search will take.


The completed search will look like this:

Clicking on the report name will open a detailed view of the audit search. From the results page, click the Export button and wait a few minutes for the file to be generated. If the page gets stuck at 0%, refresh your browser, and it should trigger the download.

Next, I needed to get all the files in the SharePoint library. To do this, I used PowerShell to connect to the target site and then downloaded the file info to a CSV.

# Connect to the SharePoint site interactively
Connect-PnPOnline -Url "https://taco.sharepoint.com/sites/test" -Interactive

# Specify the target library
$libraryName = "TheLibrary"

# Get all files from the target library
$files = Get-PnPListItem -List $libraryName -Fields "FileLeafRef", "ID", "FileRef", "Created", "Modified", "UniqueId", "GUID"

# Create an empty array to store the file metadata
$fileMetadata = @()

# Loop through each file and extract the relevant metadata
foreach ($file in $files) {
    $fileMetadata += [PSCustomObject]@{
        FileName = $file["FileLeafRef"]
        ID = $file["ID"]
        GUID = $file["GUID"]
        UniqueId = $file["UniqueId"]
        URL = $file["FileRef"]
        Created = $file["Created"]
        Modified = $file["Modified"]
    }
}

# Export the file metadata to a CSV file
$fileMetadata | Export-Csv -Path "C:\code\library_audit.csv" -NoTypeInformation

If you take anything away from this post, please take note of this: Purview uses a field named ListItemUniqueId to identify a SharePoint file or list item. My first thought was to use the GUID from the SharePoint library to match up to the Purview data. This is 100% incorrect! From SharePoint, UniqueId is the field that matches the Purview field ListItemUniqueId.

Basic logic:
SELECT SharePoint.*
FROM SharePoint
INNER JOIN Purview
ON SharePoint.UniqueId = Purview.ListItemUniqueId

I used Power BI to format and mash the exported Purview data with the SharePoint data. Power BI is unnecessary; you can easily use Power Query in Excel to do the same thing. Below, I’m including my M code statement that parses the JSON from the Purview file, and counts how many times the files were accessed and the last time they were accessed.

let
    Source = Csv.Document(File.Contents("C:\ian\20240329\Purview_Audit.csv"),[Delimiter=",", Columns=8, Encoding=1252, QuoteStyle=QuoteStyle.None]),
    #"Promoted Headers" = Table.PromoteHeaders(Source, [PromoteAllScalars=true]),
    #"Changed Type" = Table.TransformColumnTypes(#"Promoted Headers",{{"RecordId", type text}, {"CreationDate", type datetime}, {"RecordType", Int64.Type}, {"Operation", type text}, {"UserId", type text}, {"AuditData", type text}, {"AssociatedAdminUnits", type text}, {"AssociatedAdminUnitsNames", type text}}),
    #"Parsed JSON" = Table.TransformColumns(#"Changed Type",{{"AuditData", Json.Document}}),
    #"Expanded AuditData" = Table.ExpandRecordColumn(#"Parsed JSON", "AuditData", {"ListItemUniqueId", "SourceFileExtension", "SourceFileName", "ObjectId"}, {"ListItemUniqueId", "SourceFileExtension", "SourceFileName", "ObjectId"}),
    #"Removed Columns" = Table.RemoveColumns(#"Expanded AuditData",{"AssociatedAdminUnits", "AssociatedAdminUnitsNames", "RecordId"}),
    #"Renamed Columns" = Table.RenameColumns(#"Removed Columns",{{"ObjectId", "File URL"}, {"SourceFileName", "File Name"}, {"SourceFileExtension", "File Extension"}}),
    #"Filtered Rows" = Table.SelectRows(#"Renamed Columns", each ([File Extension] <> "aspx")),
    #"Filtered Rows1" = Table.SelectRows(#"Filtered Rows", each true),
    #"Removed Columns1" = Table.RemoveColumns(#"Filtered Rows1",{"Operation", "RecordType"}),
    #"Grouped Rows" = Table.Group(#"Removed Columns1", {"ListItemUniqueId"}, {{"View Count", each Table.RowCount(_), Int64.Type}, {"Last Viewed", each List.Max([CreationDate]), type nullable datetime}})
in
    #"Grouped Rows"

Still working in Power Query, I created a new query to show what SharePoint files had not been accessed. My Purview license is limited to 6 months’ worth of data, so this is one hindrance to painting a full picture of what has/has not been accessed.

let
    Source = Table.NestedJoin(SharePoint_Audit, {"UniqueId"}, Purview_Audit, {"ListItemUniqueId"}, "Purview_Audit", JoinKind.LeftAnti),
    #"Expanded Purview_Audit" = Table.ExpandTableColumn(Source, "Purview_Audit", {"File Name"}, {"Purview_Audit.File Name"}),
    #"Sorted Rows1" = Table.Sort(#"Expanded Purview_Audit",{{"Purview_Audit.P File Name", Order.Descending}}),
    #"Filtered Rows1" = Table.SelectRows(#"Sorted Rows1", each true)
in
    #"Filtered Rows1"

With this data, I can now report to the user what files have not been accessed in the past 6 months.


Effortlessly Trigger a Flow from a Power App: A Simple Step-by-Step Example

In this post, I want to show how easy it is to call a Flow from a Power App. The goal of the Power App is to pass values to the Flow, have it add them together, and return a result.

Starting from the Power Apps portal
click Create –> Blank app –> Blank canvas app, name the app, select Tablet for the format option, then click the Create button.

Power App overview:

Field Type | Field Name      | Option
Text Input | TextInputOne    | Format: Number
Text Input | TextInputTwo    | Format: Number
Label      | LabelNumberOne  |
Label      | LabelNumberTwo  |
Label      | LabelTotal      |
Label      | LabelMathResult |
Button     | ButtonCalc      |

Flow overview:
The Flow can be created directly in the Power App designer or the Power Platform portal. For this example, I’m going to use the portal.

From https://make.powerapps.com,
Click on New flow and select Automated cloud flow

Click the Skip button at the bottom of the window (this will make sense in a min.)

With the Flow designer open, click PowerApps or search for it, then click on PowerApps (V2)

In this step, add two number inputs to the action

I named my number inputs as follows: inputNumberOne and inputNumberTwo

The Flow will respond to the app using the Respond to a PowerApp or flow action. For the output, again select number; I named mine outputNumber.

For the output value, the formula should be: add(triggerBody()['number'],triggerBody()['number_1']). The 'number' and 'number_1' keys are the internal names of the two trigger inputs.

Name the Flow Flow Do Math and save it. You can test the Flow simply by clicking the Test button and supplying two input values. The Flow can be named something different, but this name aligns with the example below.

Back in the PowerApp, click the Power Automate icon.

With the Power Automate window open, click on Add flow and select the newly created Flow, or search for it and select it.

On the app design surface, select the button and update its OnSelect property to:
Set(varNumber, FlowDoMath.Run(TextInputOne.Text,TextInputTwo.Text).outputnumber)

Select the LabelMathResult field and set its Text value to varNumber

Run the app, input values in the text fields, then click the button.

What just happened?


The values of the two text input fields were passed to the Flow; the Flow added them together and returned the result in the outputnumber field, and that value was then set to the varNumber variable.

In future posts, I will dive deeper into more complex examples.



How do you find ALL the Flows that reference a SharePoint site or list?

I asked this question when I first started down the path of learning about Flow:
How do you find all the Flows running on or referencing a SharePoint list?

UPDATE / EDIT – READ THIS PART
Before you start on this, please ensure that your account or the account you are using to run the script has sufficient permissions to the target environment(s).

$oneFlow = Get-AdminFlow -FlowName "00000-ae95-4cab-96d8-0000000" -EnvironmentName "222222-4943-4068-8a2d-11111111"

$refResources = $oneFlow.Internal.properties.referencedResources
Write-Host $refResources



If you run that command and the returned properties contain an error, you do not have the correct permissions to move forward. You can check your permissions in the Power Platform admin center: https://admin.powerplatform.microsoft.com/

/end of update

Think about it: someone in your company creates a Flow that runs when a SharePoint item is updated. Fast forward a year or so, and that coworker has moved on, and the Flow needs to be updated. If you work for a small company or one that hasn’t fallen in love with Power Platform and Flow, you’re likely in luck, and finding the Flow will take a few minutes. In my case, there are currently 2,712 Flows in my tenant that span several environments.

The PowerShell script I’ve created will query a tenant using the Get-AdminFlow command, return all Flows, and then loop through them. The script can be adjusted to target a single environment using the EnvironmentName parameter. Note: running the script using the Get-Flow action will return all the Flows your AD account can access.
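For example, to scope the query to a single environment (the GUID below is just a placeholder):

$adminFlows = Get-AdminFlow -EnvironmentName "222222-4943-4068-8a2d-11111111"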

#Install-Module AzureAD
#Install-Module -Name Microsoft.PowerApps.Administration.PowerShell  
#Install-Module -Name Microsoft.PowerApps.PowerShell -AllowClobber 

#connect-AzureAD

function Get-UserFromId($id) {
    try {
        $usr = Get-AzureADUser -ObjectId $id
        return $usr.displayName
    }
    catch {
        return $null
    }
}

#get all flows in the tenant
$adminFlows = Get-AdminFlow 

#set path for output
$Path = "$([Environment]::GetFolderPath('Desktop'))\Flow_Search_for_SharePoint_$(Get-Date -Format "yyyyMMdd_HHmmss").csv"

#set target site
$targetSPSite = "https://yourTenant.sharepoint.com/sites/yourSITE"
$targetSPList = "4f4604d2-fa8f-4bae-850f-4908b4708b07"
$targetSites = @()

foreach ($gFlow in $adminFlows) {

    #check if the flow references the target site
    $refResources = $gFlow.Internal.properties.referencedResources | Where-Object { $_.resource.site -eq $targetSPSite }

    #check if the flow references the target list
    #$refResources = $gFlow.Internal.properties.referencedResources | Where-Object { $_.resource.list -eq $targetSPList }

    if ($null -ne $refResources) {

        #optional - get the user who created the Flow
        $createdBy = Get-UserFromId($gFlow.internal.properties.creator.userId)

        $row = @{}
        $row.Add("EnvironmentName", $gFlow.EnvironmentName)
        $row.Add("Name", $gFlow.DisplayName)
        $row.Add("FlowEnabled", $gFlow.Enabled)
        $row.Add("FlowGUID", $gFlow.FlowName)
        $row.Add("CreatedByUser", $createdBy)
        $row.Add("CreatedDate", $gFlow.CreatedTime)
        $row.Add("LastModifiedDate", $gFlow.lastModifiedTime)
        
        $targetSites += $(new-object psobject -Property $row)
    }
}

#output to csv
$targetSites | Export-Csv -Path $Path -NoTypeInformation

If you don’t want to get the display name of the user who created the Flow, comment out the part of the script that calls the Get-UserFromId function, and you won’t need to connect to Azure.

And to answer my original question: How do you find all the Flows running on or referencing a SharePoint list?
In the script, comment out the line that references $targetSPSite and un-comment the $targetSPList line. You can get the GUID of the list by navigating to list settings and looking at the URL. Another option is to open the list, view the Page Source, then look for the “listId” property.
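If you’d rather grab the list GUID with PowerShell, PnP makes it a one-liner; the site URL and list name below are placeholders.

Connect-PnPOnline -Url "https://yourTenant.sharepoint.com/sites/yourSITE" -Interactive
(Get-PnPList -Identity "Your List Name").Id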

In a future post(s), I will outline how to search for all Flows that use different connectors, Dynamics 365 tables (dataverse), triggered from Power Apps, or other objects. All of the info is in the properties of the Flow; getting to it can be a little fun.

Power App and SharePoint List Form Hide Field on New Item Form

How do you hide a field on a PowerApp when opening a new form? The approach below uses a single screen form instead of multiple screens for the various forms.

I started by creating a new SharePoint list and added two text fields:
Not on New Form
On New Form
Using the customize form option, I entered the Power App designer.

When the PowerApp designer opens, it will look like this:

To help see what’s going on with the form mode, add a text label to the form and set its Text property to: "Form Mode: " & Text(SharePointForm1.Mode)

Select the field (Data Card) that should not appear on the new item form, then select the Visible property. For the Visible property, enter the following: If(SharePointForm1.Mode = 1, false, true) . If your form control is named something other than SharePointForm1, use that name instead.

Breaking down the formula a little: if the SharePoint form mode is equal to 1 (New), Visible is false; otherwise, it is true. You can also write the check as SharePointForm1.Mode = FormMode.New if you prefer the named value.

Save and publish the app, then check that it works as planned.

New item form with Form Mode: 1

Display item form with Form Mode: 2

Edit item form with Form Mode: 0