This started as a simple question: where is the backend Excel file for my group Forms form stored?
By default, a group form will save responses to an Excel file in the SharePoint site associated with the group. Within that site, the file is stored in the Documents, aka Shared Documents library.
Here is a quick way to track down the file:
With the form open, click on Responses.
Click on Open in Excel.
Depending on how your SharePoint library is configured, the file will either download to your computer or open in the browser. Open the file, then click the file name in the title bar or the down arrow next to it.
When the window opens, it will show exactly where the file is stored.
In this example, the file is stored in the Shared Documents library on the Testing site. Again, this example shows Shared Documents, but on the site, it’s actually named Documents.
STOP, I don’t see the window noted in the above screenshot! This more than likely means you are working with a personal form.
Where is the Excel file stored for personal forms? Not where you'd guess, and not anywhere particularly useful. Originally, the file was more or less saved with the form itself and was inaccessible other than by downloading it; personal Forms response data is now stored in the form author's OneDrive / SharePoint site.
What if I copy my personal form to a group? What will happen to the Excel file? Don’t do this; just recreate the form from scratch. The copied form will retain the behavior of storing the file with the form, not in SharePoint.
How can I save form responses to a SharePoint list or Dataverse table? You would need to create a Flow to intercept the form response and then save it to the destination.
Will creating a Flow that saves form responses to another destination impact the form saving to Excel? No, the form will always use the backend Excel file as its data storage.
If I download a copy of the backend Excel file, will the downloaded copy be updated with new form submissions? No, the copy is disconnected from the source.
Just for fun, I wanted to see how I could replicate a basic ChatGPT-like experience using the tools available to me in the Power Platform. I will cover setting up the Power Automate flow that interfaces with the OpenAI chat completions endpoint using GPT-4 Turbo, and a Power App for the UI. In a future post, I will create additional flows to work with the Assistants API and take this to the next level.
A quick look at the app and how it works. The app will keep the chat context active using the completions endpoint until you want to clear it out. Like I said, it’s basic!
Here is what I used to get this up and running:
- Power App
- Power Automate flow with the HTTP connector
- OpenAI account
- OpenAI API key (credit card required)
Here is the layout of the flow and Power App
The key for this to work and keep the context of the conversation flowing is to pass all of the previous questions and responses back to the API each time a new question is asked. Yes, that can start to add up in terms of burning a lot of tokens for a large chat.
The input is converted to the following format and passed to the API by the app:
role: user
content: the question asked
The response from the API uses the same shape:
role: assistant
content: the response from the API
Using the two, the collection is built and displayed in a gallery.
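For illustration, here is a trimmed-down example of what the messages collection looks like by the time it reaches the API (the question and answer text here is made up):
[
  { "role": "user", "content": "What is Power Automate?" },
  { "role": "assistant", "content": "Power Automate is a cloud-based workflow service..." },
  { "role": "user", "content": "Can it call the OpenAI API?" }
]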
Power App setup – the ButtonQuestion OnSelect property does the following (a sketch of the formula follows this list):
- Set the button text
- Set the varQuestion variable to the value of TextInputQuestion
- Add the question and the user role to the colResponses collection
- Convert the colResponses collection to JSON
- Call the Get HTTP flow, passing in the collection
- With the response from the flow, add the value(s) to the colResponses collection
- Reset TextInputQuestion
- Set the button text
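Here is a minimal sketch of what that OnSelect formula could look like. The control, variable, and collection names come from this post; the flow reference (GetHTTP) and its output property (response) are placeholders for whatever your flow and its Respond action actually return:
Set(varButtonText, "Thinking...");  // button Text bound to varButtonText (assumption)
Set(varQuestion, TextInputQuestion.Text);
Collect(colResponses, {role: "user", content: varQuestion});
Set(varResult, GetHTTP.Run(JSON(colResponses)));  // GetHTTP is a placeholder flow name
Collect(colResponses, {role: "assistant", content: varResult.response});  // output property name assumed
Reset(TextInputQuestion);
Set(varButtonText, "Ask")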
Now for the flow! The flow is triggered from the app, which passes in a single value:
Messages: the new question from the app plus the previous questions and responses (the collection)
The Parse JSON action will take the Messages input and shape the JSON used in the Select Messages action.
The Select Messages action will use the output of the Parse JSON action to help form the correct input for the HTTP action.
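As a rough sketch (the Parse JSON action name here is an assumption; use whatever yours is called), the Select could be configured like this:
From: body('Parse_JSON_Messages')
Map: role = item()?['role'], content = item()?['content']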
Using a Compose action, the model and messages values are set. Note: in this example, I'm using the GPT-4 Turbo preview model.
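For reference, the output of that Compose is shaped roughly like this; the model id shown is the GPT-4 Turbo preview alias at the time of writing, and the Select action name is an assumption:
{
  "model": "gpt-4-turbo-preview",
  "messages": @{body('Select_Messages')}
}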
The HTTP Request action uses the following values:
Method: POST
URL: https://api.openai.com/v1/chat/completions
Headers:
{
  "Content-Type": "application/json",
  "Authorization": "Bearer @{outputs('Compose_API_Key')}"
}
Body: the output of the Compose Message Body action
Another Parse JSON action is used to handle the response from the HTTP Request.
In the Select Response action, the output of Parse JSON Response is used to populate the role and content values.
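A hedged sketch of that Select, based on the shape of the chat completions response (the Parse JSON action name is an assumption):
From: body('Parse_JSON_Response')?['choices']
Map: role = item()?['message']?['role'], content = item()?['message']?['content']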
The final action responds to the Power App. Note: the Respond to a PowerApp or flow step will not work if you use the dynamic content value to populate the response. You must use this expression instead: outputs('Select_Response')
Again, if data is not sent back to the Power App, chances are you missed the sentence before this one.
As I mentioned at the beginning of this post, this is a basic example. If you see anything wrong with the process or places where it could be streamlined, please let me know.
If you are here, you’re likely dealing with a Power Pages cert that’s about to expire or already has. Updating the cert is fairly straightforward, but the kicker is ensuring the user making the update has the correct permissions. When updating the cert, also check that the Website authentication key is still valid. Scroll to the bottom to see more about the auth key.
Permissions needed to update the cert and the authentication key:
- Owner on the App Registration associated with the portal
- Microsoft 365: Power Platform Admin and/or Global Admin
As noted in the warning below, you can be either a Global Admin or an owner of the App Registration. Unless your sys admin is doing the update, or you are a sys admin, gaining owner access to the App Registration might be the easiest way to update the cert and re-bind the site.
Here are the steps I took to update my portal's cert:
1. Upload the new cert under Manage custom certificates (note the cert ID)
2. Navigate to the Set up custom domains and SSL page
3. Delete the expired cert
4. In the SSL Bindings section, click +Add new and associate your host with the newly uploaded cert
Here you can grab the Application ID and locate the associated App Registration.
Open the Azure portal, search for App Registrations, then search by the App ID. Once you have the App Reg., make sure you are an Owner. In the left nav of the app, click Owners, then add yourself or have an admin add you.
After that is updated, navigate back to the Power Pages admin portal and update the Authentication key. NOTE: the site will be offline for a few minutes.
In this post, I want to show how easy it is to call a Flow from a Power App. The goal of the Power App is to pass values to the Flow, have it add them together, and return a result.
Starting from the Power Apps portal, click Create –> Blank app, select Blank canvas app, name the app, select Tablet for the format option, then click the Create button.
Power App overview:
Field Type | Field Name | Option
Text Input | TextInputOne | Format: Number
Text Input | TextInputTwo | Format: Number
Label | LabelNumberOne
Label | LabelNumberTwo
Label | LabelTotal
Label | LabelMathResult
Button | ButtonCalc
Flow overview: The Flow can be created directly in the Power App designer or the Power Platform portal. For this example, I’m going to use the portal.
Click the Skip button at the bottom of the window (this will make sense in a min.)
With the Flow designer open, click PowerApps or search for it, then click on PowerApps (V2)
In this step, add two number inputs to the action
I named my number inputs as follows: inputNumberOne and inputNumberTwo
The Flow will respond to the app using the Respond to a PowerApp or flow action. For the output, again select Number; I named mine outputNumber.
For the value of outputNumber, the formula should be: add(triggerBody()['number'],triggerBody()['number_1'])
Name the Flow as Flow do Math, and save it. You can test the Flow simply by clicking the Test button and supplying two input values. The Flow can be named something different, but this name aligns with the below example.
Back in the PowerApp, click the Power Automate icon.
With the Power Automate window open, click on Add flow and select the newly created Flow, or search for it and select it.
On the app design surface, select the button and update its OnSelect property to: Set(varNumber, FlowDoMath.Run(TextInputOne.Text,TextInputTwo.Text).outputnumber)
Select the LabelMathResult field and set its Text value to varNumber
Run the app, input values in the text fields, then click the button.
What just happened?
The values of the two text input fields were passed to the Flow; the Flow added them together and returned the result in the outputnumber field, and that value was then stored in the varNumber variable.
In future posts, I will dive deeper into more complex examples.
I'm working on a simple report to pull some data from Dataverse into a Power BI report. The data includes some choice fields, and when I first generated the report, I only retrieved the internal value of those fields. So my Status field values looked something like this: 10000001. Clearly, no one wants to see this; I needed the display values instead: Ordered, Processing, Shipped
Part of the problem was that I was using a native query to pull in the data and not selecting the correct field; Dataverse exposes two or more columns for each choice field. My query looked something like this:
Select title, status, createdon from orders where customerid = '875-6309'
The query should have been:
Select title, statusname, createdon from orders where customerid = '875-6309'
tl;dr: try placing the word 'name' directly after your choice field name:
status becomes statusname
state becomes statename
I asked this question when I first started down the path of learning about Flow: How do you find all the Flows running on or referencing a SharePoint list?
UPDATE / EDIT – READ THIS PART: Before you start on this, please ensure that your account, or the account you are using to run the script, has sufficient permissions in the target environment(s).
If you run the Get-AdminFlow command, look at the returned properties, and see an error, it means you do not have the correct permissions to move forward. You can check your permissions in the Power Platform admin center: https://admin.powerplatform.microsoft.com/
/end of update
Think about it: someone in your company creates a Flow that runs when a SharePoint item is updated. Fast forward a year or so, and that coworker has moved on, and the Flow needs to be updated. If you work for a small company or one that hasn’t fallen in love with Power Platform and Flow, you’re likely in luck, and finding the Flow will take a few minutes. In my case, there are currently 2,712 Flows in my tenant that span several environments.
The PowerShell script I’ve created will query a tenant using the Get-AdminFlow command, return all Flows, and then loop through them. The script can be adjusted to target a single environment using the EnvironmentName parameter. Note: running the script using the Get-Flow action will return all the Flows your AD account can access.
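For example, to scope the query to a single environment (the environment name below is only a placeholder):
$adminFlows = Get-AdminFlow -EnvironmentName "Default-00000000-0000-0000-0000-000000000000"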
#Install-Module AzureAD
#Install-Module -Name Microsoft.PowerApps.Administration.PowerShell
#Install-Module -Name Microsoft.PowerApps.PowerShell -AllowClobber
#connect-AzureAD
function Get-UserFromId($id) {
    try {
        $usr = Get-AzureADUser -ObjectId $id
        return $usr.displayName
    }
    catch {
        return $null
    }
}
#get all flows in the tenant
$adminFlows = Get-AdminFlow
#set path for output
$Path = "$([Environment]::GetFolderPath('Desktop'))\Flow_Search_for_SharePoint_$(Get-Date -Format "yyyyMMdd_HHmmss").csv"
#set target site
$targetSPSite = "https://yourTenant.sharepoint.com/sites/yourSITE"
$targetSPList = "4f4604d2-fa8f-4bae-850f-4908b4708b07"
$targetSites = @()
foreach ($gFlow in $adminFlows) {
    #check if the flow references the target site
    $refResources = $gFlow.Internal.properties.referencedResources | Where-Object { $_.resource.site -eq $targetSPSite }
    #check if the flow references the target list
    #$refResources = $gFlow.Internal.properties.referencedResources | Where-Object { $_.resource.list -eq $targetSPList }
    if ($null -ne $refResources) {
        #optional - get the user who created the Flow
        $createdBy = Get-UserFromId($gFlow.internal.properties.creator.userId)
        $row = @{}
        $row.Add("EnvironmentName", $gFlow.EnvironmentName)
        $row.Add("Name", $gFlow.DisplayName)
        $row.Add("FlowEnabled", $gFlow.Enabled)
        $row.Add("FlowGUID", $gFlow.FlowName)
        $row.Add("CreatedByUser", $createdBy)
        $row.Add("CreatedDate", $gFlow.CreatedTime)
        $row.Add("LastModifiedDate", $gFlow.lastModifiedTime)
        $targetSites += $(New-Object psobject -Property $row)
    }
}
#output to csv
$targetSites | Export-Csv -Path $Path -NoTypeInformation
If you don’t want to get the display name of the user who created the Flow, comment out the part of the script that calls the Get-UserFromId function, and you won’t need to connect to Azure.
And to answer my original question: How do you find all the Flows running on or referencing a SharePoint list? In the script, comment out the line that sets $refResources using $targetSPSite and un-comment the line that uses $targetSPList. You can get the GUID of the list by navigating to list settings and looking at the URL. Another option is to open the list, view the page source, then look for the "listId" property.
In a future post(s), I will outline how to search for all Flows that use different connectors, Dynamics 365 tables (dataverse), triggered from Power Apps, or other objects. All of the info is in the properties of the Flow; getting to it can be a little fun.
Last year, my team rolled out a Power App Portal (Power Pages) to allow customers to submit requests with attachments. The attachments are stored in Azure Blob Storage, and we use Cloudmersive to virus scan the submitted attachments. Not to get too deep into the weeds, the process flows like this: Attachment is uploaded –> lands in the Dataverse Note (annotation) table –> then is shipped to blob storage
Now the problem: users can name a file whatever they like and upload it. This quickly became an issue because Flow was not always able to find the blobs associated with a request if the filename contained certain special characters. Example: MyTrademark®.pdf
There are some great examples online for replacing special characters with a space or another supported character, but I wanted to take a different approach that seemed a lot more efficient to me. Where my example differs is the use of the Filter Array Flow action to only check the characters of the filename, as opposed to looping through each letter of the alphabet and comparing it to each letter in the filename.
Here’s the completed Flow, but I’ll dig into each step in this post.
The Compose Chars action holds the array of characters I will use to validate the characters in the supplied filename. This can be shortened if the input filename is to be set to uppercase or lowercase; only one set of the alphabet is needed.
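For reference, the Compose Chars action holds an array of objects with a char property, something like this trimmed-down sample (the real action lists every character you consider valid):
[
  { "char": "a" },
  { "char": "b" },
  { "char": "c" },
  { "char": "1" },
  { "char": "2" }
]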
Compose Org Filename: string('my super 123 longer $%^&^ file /// name ^^^ with junk in it.xlsx')
Compose Split Extension: last(split(outputs('Compose_Org_Filename'), '.'))
Compose Concat Extension: concat('.', outputs('Compose_Split_Extension'))
Compose Get Filename: split(outputs('Compose_Org_Filename'), outputs('Compose_Concat_Extension'))[0]
The point of the Apply to each loop is to iterate over each character in the filename. Note: I'm using a Chunk function to break apart the filename. I first tried using a Split function, but there would be no end to what the delimiter might be.
Apply to each: chunk(outputs('Compose_Get_Filename'), 1)
Filter array Chars:
From: Compose Chars
char is equal to Current item
Here is the advanced view of the action: @equals(item()?['char'], items('Apply_to_each'))
If you think of it like a SQL statement, it would be: Select * from [Compose Chars] Where char = [Current item]
The filter checks whether the current item in the Apply to each loop exists in the Compose Chars array.
Condition: empty(body('Filter_array_Chars')) is equal to true
If the current item is not in the array, skip it (Yes branch); otherwise, start building the filename (No branch).
This hack is needed because a Flow variable can't reference itself in a Set variable action. Think of it like a programmatic iteration: i++ or i = i + 1.
Compose Temp is a placeholder for the current value of the varNameBuilder variable (its input is variables('varNameBuilder')).
Set variable Name Builder: concat(outputs('Compose_Temp'), items('Apply_to_each'))
Things to consider:
Empty filename – what if the filename is nothing but special / unwanted characters? At the end of the Flow, you'd want to use a Length function to check that varNameBuilder is greater than X. Example: !@#@$#$%%^&.pdf – the result from the Flow would be .pdf, and updating the filename would fail. To my knowledge, you can't name a file like that, but you get the point.
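A hedged example of that check, using the variable name from this post: add a Condition after the loop with an expression like greater(length(variables('varNameBuilder')), 0) and only rename the file when it evaluates to true.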
Also, I’ve had users upload files with non-English characters, so there is a viable chance that someone, at some point, might upload a file like this: 我喜欢炸玉米饼.pdf
Making the Flow available to other flows – If the Flow is created in a solution, it could be used more like a function, and other Flows in the solution could reference it. This would be a great example of a reusable Child Flow.
How do you hide a field on a PowerApp when opening a new form? The approach below uses a single screen form instead of multiple screens for the various forms.
I started by creating a new SharePoint list and added two text fields:
- Not on New Form
- On New Form
Using the customize form option, I entered the Power App designer.
When the PowerApp designer opens, it will look like this:
To help see what’s going on with the form mode, add a text label to the form and set its Text property to: "Form Mode: " & Text(SharePointForm1.Mode)
Select the field (Data Card) that should not appear on the new item form, then select its Visible property. For the Visible property, enter the following: If(SharePointForm1.Mode = 1, false, true). If your form is named something other than SharePointForm1, use that name in place of the one shown here.
Breaking down the formula a little: If the SharePoint form mode is equal to 1, visible should be false, else true.
Save and publish the app, then check that it works as planned.
I'm working with the Dataverse Web API and ran into this error while trying to write to a table:
Invoke-RestMethod : {"error":{"code":"0x8006088a","message":"Resource not found for the segment 'table name'."}}
The fix is to use the plural name of the table, but sometimes my engrish ain’t the bestest and I was struggling to figure out what the plural of Taco Order was (joking). If you want to find all the tables in your environment quickly, you can toss the API URL into a browser, which will list all the plural table names.
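The URL follows this format, where yourorg and the API version are placeholders for your own environment: https://yourorg.crm.dynamics.com/api/data/v9.2/ — opening it in a browser while signed in returns the list of entity sets, i.e., the plural table names.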
Microsoft Power Automate (Flow) has a hard time with files over 1GB, and depending on the business process, this can cause a lot of headaches. Because of this, I opted to use an Azure runbook to download the files, then use a Flow to handle downstream processing. The scope of this post is to show how to download files from an SFTP server and move them to blob storage.
Tooling being used:
- Azure Resource Group
- Azure Automation Account
- Posh-SSH Module
- Azure Runbook (PowerShell) + Hybrid Worker
- Azure Storage Account
Overview of what the script does:
- Connect to Azure
- Connect to the SFTP server
- Get a list of files from the server where the size is greater than 0
- Loop through the files:
  - Create a temp folder
  - Download the file from SFTP to the temp folder
  - Create a new blob using the file from the temp folder
  - Check if the blob was created; if yes, remove the associated file from the SFTP server
- Clean up the SFTP session
#sftp items
$port = "22"
$Password = "my password"
$User = "sftp-user"
$hostName = "sftp.sharepointed.com"
$SftpPath = '/sftp_home/out/bigfiles/'
#blob storage related variables
$container = "mycontainer"
$blobFilePath = "myfolder/newfiles"
#get connection setup
$account = Connect-AzAccount -Identity -Confirm:$false | Out-Null
write-output $account
try {
    $accKey = (Get-AzStorageAccountKey -ResourceGroupName "my rg name" -Name "my value")[0].Value
    $context_storageAcct = New-AzStorageContext -StorageAccountName "my value" -StorageAccountKey $accKey
    write-output $context_storageAcct
}
catch {
    $errors += [PSCustomObject]@{Item = "azcontext"; Error = $_.Exception }
    write-output "error get AZcontext"
}
#sftp setup
$secpasswd = ConvertTo-SecureString $Password -AsPlainText -Force
$Credentials = New-Object System.Management.Automation.PSCredential($User, $secpasswd)
write-output "====== Starting SFTP Session on $($hostName)"
$ssh = New-SFTPSession -ComputerName $hostName -Credential $Credentials -Port $port -Force -Verbose
write-output " +++ SFTP Session started on $($hostName)"
#get files from sftp
$listOfFiles = Get-SFTPChildItem -SessionId $ssh.SessionId -Path $SftpPath
$filteredFiles = @($listOfFiles | Where-Object { $_.Attributes.Size -gt 0 })
foreach ($file in $filteredFiles) {
    try {
        $blobPath = $($blobFilePath + "/" + $file.Name)
        $localFolderBase = [System.Io.Path]::GetFileNameWithoutExtension($file.Name)
        $localFolder = $($env:temp + "\" + $localFolderBase)
        $fileToCopy = $($localFolder + "\" + $file.Name)
        #create new folder to hold the downloaded file
        New-Item -Path $localFolder -ItemType Directory -Force -Confirm:$false | Out-Null
        #download file to local storage
        Get-SFTPFile -SessionId $ssh.SessionId -RemoteFile $file.FullName -LocalPath $localFolder
        #move file to container
        Set-AzStorageBlobContent -File $fileToCopy -Container $container -Blob $blobPath -StandardBlobTier Cool -Context $context_storageAcct.Context -Force -Confirm:$false
        #check if file was saved to the container
        $blob = Get-AzStorageBlob -Blob $blobPath -Container $container -Context $context_storageAcct.Context -ErrorAction Ignore
        if ($blob) {
            Write-Output "Blob can be removed from SFTP"
            Remove-SFTPItem -SessionId $ssh.SessionId -Path $file.FullName -Force
        }
    }
    catch {
        write-output $_.Exception
    }
}
#terminate the SFTP session
Remove-SFTPSession -SessionId $ssh.SessionID
The script can easily be expanded to trigger another runbook or move the files to SharePoint. One thing to note: my runbooks run on a Hybrid Worker, which helps a lot with long-running or memory-intensive jobs.
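For example, kicking off another runbook at the end of the script could look something like this (the automation account, runbook, and hybrid worker group names are placeholders):
Start-AzAutomationRunbook -ResourceGroupName "my rg name" -AutomationAccountName "myAutomationAccount" -Name "Process-NewBlobs" -RunOn "myHybridWorkerGroup"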