# Function to acquire an access token and return the token and its expiration time.
# Assumes $tenantId, $clientId, and $clientSecret are already defined in the session.
function Get-GraphAccessToken {
    $tokenUrl = "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token"
    $body = @{
        client_id     = $clientId
        scope         = "https://graph.microsoft.com/.default"
        client_secret = $clientSecret
        grant_type    = "client_credentials"
    }
    $tokenResponse = Invoke-RestMethod -Uri $tokenUrl -Method Post -Body $body -ContentType "application/x-www-form-urlencoded"
    $accessToken = $tokenResponse.access_token
    $expiresIn = $tokenResponse.expires_in
    $tokenExpiration = (Get-Date).AddSeconds($expiresIn - 300) # Refresh 5 minutes before expiration
    return $accessToken, $tokenExpiration
}
# Acquire the token
$accessToken, $tokenExpiration = Get-GraphAccessToken
$headers = @{
    Authorization  = "Bearer $accessToken"
    "Content-Type" = "application/json"
}
# Target SharePoint site
$siteUrl = "https://taco.sharepoint.com/sites/test-site"
# Extract the hostname and site path
$uri = [System.Uri]::new($siteUrl)
$hostname = $uri.Host
$sitePath = $uri.AbsolutePath.TrimStart('/')
# Define the endpoint URL to get the SharePoint site ID
$graphSiteUrl = "https://graph.microsoft.com/v1.0/sites/${hostname}:/${sitePath}"
# Make the request to get the site ID
$siteResponse = Invoke-RestMethod -Uri $graphSiteUrl -Headers $headers -Method Get
# Extract the relevant part of the site ID
$siteIdParts = $siteResponse.id.Split(',')
$siteId = "$($siteIdParts[1]),$($siteIdParts[2])"
# Output the site ID
# Output the site details
Write-Output "Site Hostname: $($siteResponse.siteCollection.hostname)"
Write-Output "Site ID: $siteId"
Write-Output "Site Display Name: $($siteResponse.displayName)"
Write-Output "Site Web URL: $($siteResponse.webUrl)"
This will return a value like 2480e89d-303a-4f38-b4fe-27f824ff88ac,d605ce5c-f356-422a-84fe-1d7820bc9e6d, where the first GUID is the site collection ID and the second is the site ID.
Site collection ID: 2480e89d-303a-4f38-b4fe-27f824ff88ac
Site ID: d605ce5c-f356-422a-84fe-1d7820bc9e6d
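With the ID in hand, you can use it in follow-up Graph calls. As an example sketch, this lists the drives (document libraries) in the site, reusing the $headers from above and the full three-part ID Graph returned:

```powershell
# List the drives (document libraries) in the site, using the full
# three-part ID Graph returned (hostname,siteCollectionId,webId)
$drivesUrl = "https://graph.microsoft.com/v1.0/sites/$($siteResponse.id)/drives"
$drivesResponse = Invoke-RestMethod -Uri $drivesUrl -Headers $headers -Method Get
$drivesResponse.value | ForEach-Object { Write-Output $_.name }
```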
I’m working on a project that requires currency conversions between the US dollar and the Euro. In the most basic case, the project has two requirements: backfill historical Euro amounts, and get current Euro amounts using a transaction date and USD value. Plugging into an API to get the current exchange rate is simple enough, but finding an open or free dataset with 10+ years of currency data was another thing. Google and GPT/LLMs are littered with what appear to be free sites, but they are limited to ~100 API calls, or their datasets are not deep enough for my use case. I landed on a great tool provided by the Federal Reserve named FRED, short for Federal Reserve Economic Data. FRED is a free site with APIs sitting on a treasure trove of data. When I started this project, I simply went to the FRED site and downloaded the dataset I needed (link), but I wanted to ensure that my process was current and could handle new transactions for years to come. Using the FRED API requires signing up for a free account, and you will want a FRED API key to follow along with this demo.
What I’m going to demo in this post: creating a FRED account, using PowerShell to read from an Excel file, querying an API, writing back to the Excel file
FRED account: Visit the https://fred.stlouisfed.org/ site, click My Account (top right), and click Create New Account when the modal opens. After you’ve created an account, navigate to the My Account page and click API Keys in the left nav. On the API Keys page, click the Request API Key button, input some text in the description box, click the agreement checkbox, and then click Request API Key. Link to the API Key page: https://fredaccount.stlouisfed.org/apikeys
For this demo, I’ve created a simple Excel file with the following columns and datatypes: TransDate (date), USD (currency), ConversionDate (date), EUR (currency)
To interact with an Excel file from PowerShell, I went with the ImportExcel module. In VS Code or your IDE of choice, run this command: Install-Module ImportExcel -Scope CurrentUser
To get the ball rolling and ensure the ImportExcel module works, I will test reading from the Excel file, loop through the rows, and output their values. $excelPath: location of the Excel file; $worksheetName: name of the worksheet/tab where the data is stored (optional); $excelData: imported Excel data
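Here is a minimal sketch of that test. It assumes the ImportExcel module is installed and that the file path, worksheet name, and column names (TransDate, USD) match the demo file described above:

```powershell
$excelPath = "C:\code\CurrencyDemo.xlsx"   # location of the Excel file
$worksheetName = "Historical"              # worksheet/tab where the data is stored
# Import the worksheet into an array of row objects
$excelData = Import-Excel -Path $excelPath -WorksheetName $worksheetName
# Loop through the rows and output their values
foreach ($row in $excelData) {
    Write-Output "TransDate: $($row.TransDate)  USD: $($row.USD)"
}
```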
Next, I will test my connection to the FRED API, returning a sample transaction. There are two important things to note in the next script. The $series variable is bound to the USD to Euro spot exchange rate; if you need to work with a different currency, visit the Daily Rates page and filter by Geographies, or use the site search if you cannot find what you are looking for there. If you type Peso in the site search, it will suggest the Mexican Peso to U.S. Dollar. Clicking on the search result will open the page for that conversion, and the page will reveal the $series value needed for the conversion. The Peso to USD series is DEXMXUS (look at the URL or the value to the right of the conversion overview). The next important variable to note is $date; this is obvious for this example, but you can query the API for larger date ranges if needed and work with the larger API response.
# Your FRED API Key
$apiKey = "75fa2e6ce85_taco_805016ea4d764c5"
# Set the parameters
$series = "DEXUSEU" # Series ID for the USD to Euro exchange rate
$date = "2024-01-16"
# Construct the API URL
$url = "https://api.stlouisfed.org/fred/series/observations?series_id=$series&observation_start=$date&observation_end=$date&api_key=$apiKey&file_type=json"
# Make the API request
$response = Invoke-RestMethod -Uri $url -Method Get
# Check if we got a result
if ($response.observations.Count -gt 0) {
    $usd_to_eur_rate = [double]$response.observations[0].value
    $eur_to_usd_rate = [math]::Round(1 / $usd_to_eur_rate, 4)
    Write-Output "The USD to Euro conversion rate on $date was: $usd_to_eur_rate"
    Write-Output "The Euro to USD conversion rate on $date was: $eur_to_usd_rate"
} else {
    Write-Output "No data available for the specified date."
}
In the last script for this demo, I will combine all the parts and provide an example for dealing with input dates that are Saturday or Sunday. From what I’ve learned on this journey, currencies are not typically traded seven days a week, so if an input date falls on a weekend, there needs to be an offset to the preceding Friday. This script must be extended in a production scenario to deal with major holidays.
function CurrencyConversion {
    param (
        $convDate,
        $usdAmount
    )
    # Normalize the input (DateTime or date string) into a datetime object
    $parsedDate = Get-Date $convDate
    $apiDateValue = $parsedDate.ToString("yyyy-MM-dd")
    # Your FRED API Key
    $apiKey = "75fa2e6ce85_taco_805016ea4d764c5"
    $seriesId = "DEXUSEU" # Daily USD to Euro spot exchange rate
    # Construct the API URL
    $apiUrl = "https://api.stlouisfed.org/fred/series/observations?series_id=$seriesId&api_key=$apiKey&file_type=json&observation_start=$apiDateValue&observation_end=$apiDateValue"
    # Make the API call
    $response = Invoke-RestMethod -Uri $apiUrl
    # Check if there are any observations for the given date
    if ($response.observations.count -gt 0) {
        # Assuming the first observation is the one we're interested in
        $usd_to_eur_rate = [double]$response.observations[0].value
        $eur_to_usd_rate = [math]::Round(1 / $usd_to_eur_rate, 4)
        return $usdAmount * $eur_to_usd_rate
    }
    # No rate available; return $null so the caller can skip the row
    Write-Host "No data found for $parsedDate"
    return $null
}
function DateConversion {
    param (
        $conversionDate
    )
    # Check if 'conversionDate' is not null or empty
    if ([string]::IsNullOrWhiteSpace($conversionDate)) {
        return $null
    }
    # Parse the input into a datetime object
    $targetDate = Get-Date $conversionDate
    # Currencies aren't traded on weekends; shift Saturday/Sunday back to Friday
    if ($targetDate.DayOfWeek -eq [DayOfWeek]::Saturday) {
        $targetDate = $targetDate.AddDays(-1)
    }
    elseif ($targetDate.DayOfWeek -eq [DayOfWeek]::Sunday) {
        $targetDate = $targetDate.AddDays(-2)
    }
    return $targetDate
}
$excelPath = "C:\code\CurrencyDemo.xlsx"
$worksheetName = "Historical"
$excelData = Import-Excel -Path $excelPath -WorksheetName $worksheetName
foreach ($row in $excelData) {
    $transDate = $row.TransDate
    $amountUSD = $row.USD
    $submittedDate = $null
    $convertedValue = $null
    # Get the date for the currency conversion
    if (-not [string]::IsNullOrWhiteSpace($transDate)) {
        $submittedDate = DateConversion -conversionDate $transDate
    }
    # Check that both the submitted date and USD amount are present
    if (-not [string]::IsNullOrWhiteSpace($submittedDate) -and
        -not [string]::IsNullOrWhiteSpace($amountUSD)) {
        $convertedValue = CurrencyConversion -convDate $submittedDate -usdAmount $amountUSD
        Write-Output "Converted value for $amountUSD USD on $(Get-Date $submittedDate -Format 'yyyy-MM-dd'): $convertedValue"
    }
    # Update the Excel row with the output
    $row.EUR = $convertedValue
    $row.ConversionDate = $submittedDate
}
# Export the updated data to Excel
$excelData | Export-Excel -Path $excelPath -WorksheetName $worksheetName
To streamline the script, I created two helper functions. One handles the weekend-to-Friday offset, and the other makes the API call to FRED. The script loops over all of the rows in the spreadsheet, handles the currency conversion, and then bulk-writes the output to the target Excel file. In the sample output, you can see where a weekend date was passed in and the script offset it to the preceding Friday.
Yes, some places in the script need improvement, but I wanted to provide a simple example for handling currency conversion with PowerShell and historical dates. As always, please don’t hesitate to reach out or leave a comment if any part of this doesn’t make sense or if there’s a more-better way of doing things.
Note: Be mindful of the number of calls you make to the API in a given timeframe. While testing this process, I hammered on the API with ~1,000 calls and hit an API limit error. Adding a simple pause to the script fixed the problem, e.g., after X calls, pause for Y seconds.
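One way to sketch that pause (the 100-call / 60-second numbers are arbitrary placeholders; tune them for your workload and whatever limit you actually hit):

```powershell
$callCount = 0
foreach ($row in $excelData) {
    # ... make the FRED API call for this row ...
    $callCount++
    if ($callCount % 100 -eq 0) {
        # Back off periodically to stay under the API rate limit
        Start-Sleep -Seconds 60
    }
}
```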
As of today (June 28th, 2024), the Microsoft.Graph and PnP.PowerShell modules do not play well together in an Azure Automation Account runbook. I upgraded my PnP.PowerShell version to 2.4.0 two weeks ago and was able to execute a runbook that uses PnP.PowerShell and Graph without issue. A few days ago, the same runbook began failing with the following errors:
ClientCertificateCredential authentication failed: Could not load type 'Microsoft.Identity.Client.BaseAbstractApplicationBuilder`1' from assembly 'Microsoft.Identity.Client, Version=4.50.0.0, Culture=neutral, PublicKeyToken=
Could not load type 'Microsoft.Graph.Authentication.AzureIdentityAccessTokenProvider' from assembly 'Microsoft.Graph.Core, Version=1.25.1.0, Culture=neutral, PublicKeyToken=
The fix is simple: remove the PnP.PowerShell 2.4.0 module from your Automation Account, then roll back to 2.3.0. From your desktop, open a PowerShell terminal and enter the following:
Install-Module -Name PnP.PowerShell -RequiredVersion 2.3.0
Save-Module -Name PnP.PowerShell -RequiredVersion 2.3.0 -Path c:\PATH
Zip the exported folder, creating a new file named PnP.PowerShell.zip, then upload the zip file to the Automation Account.
If you are not already using the Runtime Environments feature in an Automation Account, you should start looking at it. The feature allows for easier testing of module updates before updating the core modules in the Automation Account. A simple strategy is to create one Runtime Environment for testing and one for production/release. Then, from your runbook(s), you can plug into the testing runtime, test it out, and switch over to the production/release environment. https://learn.microsoft.com/en-us/azure/automation/runtime-environment-overview
Starting a year or two ago, Microsoft announced it would stop supporting, and eventually block access to, Azure Access Control Services (ACS) and the SharePoint Add-In model. This is important because ACS has been used for many years to grant apps and scripts API access to SharePoint sites, and you likely have many sites where this has been used. Moving forward, Microsoft Entra ID (Azure AD) app registrations will be used in place of ACS.
Historically, you would start the permissions journey by generating the client ID and secret at this endpoint: _layouts/15/AppRegNew.aspx. From there, you would grant the newly created identity access to a tenant, sites, lists, libraries, or a combination, at: _layouts/15/AppInv.aspx
With that out of the way, how do you wire up a new connection to SharePoint Online, allowing PowerShell, Python, script, or app access to the SharePoint API or the Microsoft Graph API?
Overview of what I’m doing:
- Create a self-signed cert
- Add the cert to your personal cert store and upload it to Azure, creating an App Registration
- Adjust permissions as needed
- Grant the App Registration access to a specific SharePoint site
- Use the newly created credentials to access the SharePoint site
#create cert with a password
New-PnPAzureCertificate `
-CommonName "Demo_SP_Azure_2024" `
-OutPfx c:\code\Demo_SP_Azure_2024.pfx `
-OutCert c:\code\Demo_SP_Azure_2024.cer `
-CertificatePassword (ConvertTo-SecureString -String "Taco@Good" -AsPlainText -Force) `
-ValidYears 1
#import the cert. for this to work, run as Admin.
Import-PfxCertificate `
-Exportable `
-CertStoreLocation Cert:\LocalMachine\My `
-FilePath c:\code\Demo_SP_Azure_2024.pfx `
-Password (ConvertTo-SecureString -String "Taco@Good" -AsPlainText -Force)
I highly suggest not skipping the -Interactive switch in the next command. It will open a browser window where you must authenticate with an account that has adequate permissions to create a new App Registration in Azure. The most important thing to note in the script is the SharePointApplicationPermissions value of Sites.Selected. Why is this important? It is extremely useful if you want to limit permissions to a single SharePoint site and not every site in the tenant.
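The command in question is Register-PnPAzureADApp. A sketch of what it looks like with the cert created above — the tenant name here is a placeholder you will need to swap for your own:

```powershell
# Create the App Registration in Azure, uploading the cert created earlier.
# Sites.Selected limits SharePoint API access to explicitly granted sites.
Register-PnPAzureADApp `
    -ApplicationName "Demo_SP_Azure_2024" `
    -Tenant "tacoranch.onmicrosoft.com" `
    -CertificatePath "c:\code\Demo_SP_Azure_2024.pfx" `
    -CertificatePassword (ConvertTo-SecureString -String "Taco@Good" -AsPlainText -Force) `
    -SharePointApplicationPermissions "Sites.Selected" `
    -Interactive
```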
After that runs, the output will include the Client ID and Certificate Thumbprint. Take note of them, or you can open Azure, navigate to App Registrations, and select All applications. In the left nav, click Certificates & secrets, where you’ll find the thumbprint; again in the left nav, click Overview, and you’ll see the Application ID, aka Client ID.
In the next two commands, you will connect to the SharePoint Admin site interactively, then grant the newly created App Registration write access to the target SharePoint site.
Connect-PnPOnline -Url "https://tacoranch-admin.sharepoint.com" `
-Interactive
#grant the App Reg write access to the site
Grant-PnPAzureADAppSitePermission `
-AppId 'a95ddafe-6eca-488a-b26a-dc62a64d9105' `
-DisplayName 'Demo_SP_Azure_2024' `
-Site 'https://tacoranch.sharepoint.com/sites/cert-demo' `
-Permissions Write
Now that the App Registration can access the SharePoint site, how do you connect to it using the certificate and thumbprint?
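A sketch of the connection, reusing the Client ID from above; the thumbprint value is a placeholder for the one output when the App Registration was created:

```powershell
# Connect to the site using the App Registration's cert (app-only auth)
Connect-PnPOnline -Url "https://tacoranch.sharepoint.com/sites/cert-demo" `
    -ClientId 'a95ddafe-6eca-488a-b26a-dc62a64d9105' `
    -Thumbprint '<certificate thumbprint from the app registration>' `
    -Tenant 'tacoranch.onmicrosoft.com'
# Quick smoke test: return the web's title
Get-PnPWeb
```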
Error(s) and fixes:
Error: Grant-PnPAzureADAppSitePermission: {"error":{"code":"accessDenied","message":"Access denied","innerError":{"date":"2024-03-21T16:29:47","request-id":"","client-request-id":""}}}
Fix: Ensure the account running the Grant-PnPAzureADAppSitePermission command has access to the target SharePoint site.
Error: Connect-PnPOnline: A configuration issue is preventing authentication - check the error message from the server for details. You can modify the configuration in the application registration portal. See https://aka.ms/msal-net-invalid-client for details. Original exception: AADSTS700027: The certificate with identifier used to sign the client assertion is not registered on application. [Reason - The key was not found., Thumbprint of key used by client: '6C5891197B54B9535D179D6D9DD7D6D351039D8Z', Please visit the Azure Portal, Graph Explorer or directly use MS Graph to see configured keys for app Id 'eb7f9fbc-f4ee-4a48-a008-49d6bcdc6c40'. Review the documentation at https://docs.microsoft.com/en-us/graph/deployments to determine the corresponding service endpoint and https://docs.microsoft.com/en-us/graph/api/application-get?view=graph-rest-1.0&tabs=http to build a query request URL, such as 'https://graph.microsoft.com/beta/applications/aeb7f9fbc-f4ee-4a48-a008-49d6bcdc6c40']. Trace ID: 3e1e7ab3-60c0-4126-acb8-e2fdb6e28000 Correlation ID: 62ddc80b-aeb2-49c5-8c31-83a04e70bf6e Timestamp: 2024-04-01 12:21:59Z
Fix: This one is easy; ensure you use the correct Client ID and thumbprint. You can get these from the app registration page in the Azure portal.
I asked this question when I first started down the path of learning about Flow: How do you find all the Flows running on or referencing a SharePoint list?
UPDATE / EDIT – READ THIS part before you start. Please ensure that your account, or the account you are using to run the script, has sufficient permissions in the target environment(s).
If you run the Get-AdminFlow command (used in the script below), look at the returned properties, and see an error, you do not have the correct permissions to move forward. You can check your permissions in the Power Platform admin center: https://admin.powerplatform.microsoft.com/
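A quick way to run that check — a sketch assuming the Microsoft.PowerApps.Administration.PowerShell module is installed:

```powershell
# Sign in, then fetch a single Flow; if the returned properties show an
# error (or nothing comes back), you lack admin access to the environment
Add-PowerAppsAccount
$testFlow = Get-AdminFlow | Select-Object -First 1
$testFlow | Select-Object -Property *
```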
/end of update
Think about it: someone in your company creates a Flow that runs when a SharePoint item is updated. Fast forward a year or so, and that coworker has moved on, and the Flow needs to be updated. If you work for a small company or one that hasn’t fallen in love with Power Platform and Flow, you’re likely in luck, and finding the Flow will take a few minutes. In my case, there are currently 2,712 Flows in my tenant that span several environments.
The PowerShell script I’ve created will query a tenant using the Get-AdminFlow command, return all Flows, and then loop through them. The script can be adjusted to target a single environment using the EnvironmentName parameter. Note: running the script using the Get-Flow action will return all the Flows your AD account can access.
#Install-Module AzureAD
#Install-Module -Name Microsoft.PowerApps.Administration.PowerShell
#Install-Module -Name Microsoft.PowerApps.PowerShell -AllowClobber
#Connect-AzureAD
function Get-UserFromId($id) {
    try {
        $usr = Get-AzureADUser -ObjectId $id
        return $usr.DisplayName
    }
    catch {
        return $null
    }
}
#get all flows in the tenant
$adminFlows = Get-AdminFlow
#set path for output
$Path = "$([Environment]::GetFolderPath('Desktop'))\Flow_Search_for_SharePoint_$(Get-Date -Format "yyyyMMdd_HHmmss").csv"
#set target site and list
$targetSPSite = "https://yourTenant.sharepoint.com/sites/yourSITE"
$targetSPList = "4f4604d2-fa8f-4bae-850f-4908b4708b07"
$targetSites = @()
foreach ($gFlow in $adminFlows) {
    #check if the flow references the target site
    $refResources = $gFlow.Internal.properties.referencedResources | Where-Object { $_.resource.site -eq $targetSPSite }
    #check if the flow references the target list
    #$refResources = $gFlow.Internal.properties.referencedResources | Where-Object { $_.resource.list -eq $targetSPList }
    if ($null -ne $refResources) {
        #optional - get the user who created the Flow
        $createdBy = Get-UserFromId $gFlow.Internal.properties.creator.userId
        $row = @{}
        $row.Add("EnvironmentName", $gFlow.EnvironmentName)
        $row.Add("Name", $gFlow.DisplayName)
        $row.Add("FlowEnabled", $gFlow.Enabled)
        $row.Add("FlowGUID", $gFlow.FlowName)
        $row.Add("CreatedByUser", $createdBy)
        $row.Add("CreatedDate", $gFlow.CreatedTime)
        $row.Add("LastModifiedDate", $gFlow.LastModifiedTime)
        $targetSites += New-Object psobject -Property $row
    }
}
#output to csv
$targetSites | Export-Csv -Path $Path -NoTypeInformation
If you don’t want to get the display name of the user who created the Flow, comment out the part of the script that calls the Get-UserFromId function, and you won’t need to connect to Azure.
And to answer my original question: How do you find all the Flows running on or referencing a SharePoint list? In the script, comment out the part of the script that references $targetSPSite and un-comment $targetSPList. You can get the GUID of the list by navigating to list settings and looking at the URL. Another option is to open the list, view the Page Source, then look for the “listId” property.
In a future post(s), I will outline how to search for all Flows that use different connectors, Dynamics 365 tables (dataverse), triggered from Power Apps, or other objects. All of the info is in the properties of the Flow; getting to it can be a little fun.
Quick and easy way to search for an item across site collections. I would suggest using one of the keyword query tools to fine-tune your queries. Also note that SharePoint will limit your search results to 10,000 items, but you can page your results and cycle through them. In the example below, I’m searching across all the site collections off of the /sites/ managed path. With the returned dataset, I’m looping through the rows and getting the SPFile for each row.
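A sketch of what that looks like with the server-side object model. This assumes you are on a farm server with the SharePoint snap-in available; the site URL and query text are placeholders, and the exact result-table handling may need adjusting for your SharePoint version:

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
$site = Get-SPSite "https://sharepoint.sharepointed.com"
# Build a keyword query scoped to the /sites/ managed path
$kq = New-Object Microsoft.Office.Server.Search.Query.KeywordQuery($site)
$kq.QueryText = 'SomeFileName.docx Path:"https://sharepoint.sharepointed.com/sites/*"'
$kq.RowLimit = 500   # results cap at 10,000; page with StartRow if you need more
$executor = New-Object Microsoft.Office.Server.Search.Query.SearchExecutor
$results = $executor.ExecuteQuery($kq)
# Grab the relevant-results table and loop the rows, getting each SPFile
$relevant = $results.Filter("TableType", [Microsoft.Office.Server.Search.Query.KnownTableTypes]::RelevantResults) |
    Select-Object -First 1
foreach ($row in $relevant.Table.Rows) {
    $file = $site.RootWeb.GetFile([string]$row["Path"])
    Write-Output "$($file.Name) - $($row['Path'])"
}
```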
Error: The Web application at https://sharepoint.sharepointed.com could not be found. Verify that you have typed the URL correctly. If the URL should be serving existing content, the system administrator may need to add a new request URL mapping to the intended application.
I created a .net console app to update some stuff in SharePoint. I received the above error when executing the .exe file with a new service account.
First, I tried granting Shell access to the content db I was working with, but that didn’t solve the problem.
In this example, I needed to search a farm for every site under a managed path. BUT the sites I was searching for were built using a 3rd party tool and would not correctly appear in the search results. The problem was related to having trim duplicates enabled by default. Easy fix: set the query's TrimDuplicates property to false.
The search results will display all sites that have Taco as their managed path. If you are not retrieving the results you expect, try setting TrimDuplicates = $false.
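A minimal sketch of that site search with trim duplicates switched off — the farm URL and Taco managed path are placeholders for your own:

```powershell
$site = Get-SPSite "https://sharepoint.sharepointed.com"
$kq = New-Object Microsoft.Office.Server.Search.Query.KeywordQuery($site)
# contentclass:STS_Site restricts results to site collection objects
$kq.QueryText = 'contentclass:STS_Site Path:"https://sharepoint.sharepointed.com/Taco/*"'
$kq.TrimDuplicates = $false   # don't collapse near-identical sites out of the results
$executor = New-Object Microsoft.Office.Server.Search.Query.SearchExecutor
$results = $executor.ExecuteQuery($kq)
```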