Use Python to upload a LARGE file to SharePoint

In this post, I will quickly show how to use the Office365-REST-Python-Client library to upload a large file to a SharePoint library.

For this to work, you will need a certificate, an Azure App registration, and access to the target SharePoint site. I outlined all the necessary parts in this post: Modernizing Authentication in SharePoint Online. Note: the linked post outputs a .PFX cert, and the script below needs a .PEM cert. You can use this Python script to convert it:

# install this library if needed
# pip install cryptography
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.serialization import pkcs12

# Load the PFX file; replace with your PFX file path (the password here is None)
with open('C:\\path_to_cert\\EXAMPLE.pfx', 'rb') as f:
    pfx_data = f.read()
private_key, certificate, additional_certificates = pkcs12.load_key_and_certificates(pfx_data, None)

# Write the private key and the certificate into a single PEM file
with open('NewCERT.pem', 'wb') as f:
    f.write(private_key.private_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PrivateFormat.TraditionalOpenSSL,
        encryption_algorithm=serialization.NoEncryption()
    ))
    f.write(certificate.public_bytes(serialization.Encoding.PEM))
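If you have OpenSSL available, the same conversion can be done with a one-liner (-nodes leaves the private key unencrypted, matching the script above):

openssl pkcs12 -in EXAMPLE.pfx -out NewCERT.pem -nodes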

Ok, with that out of the way, you can use this script to upload to a SharePoint library. In the script, I’ve commented out the line that would be used to upload to a folder within a library.

import os
from office365.sharepoint.client_context import ClientContext

cert_credentials = {
    "tenant": "abcxyz-1234-4567-8910-0e3d638792fb",
    "client_id": "abcddd-4444-4444-cccc-123456789111",
    "thumbprint": "7D8D8DF7D8D2F4DF8DF45D4FD8FD48DF5D8D",
    "cert_path": "RestClient\\NewCERT.pem"
}
ctx = ClientContext("https://tacoranch.sharepoint.com/sites/python").with_client_certificate(**cert_credentials)
current_web = ctx.web.get().execute_query()
print("{0}".format(current_web.url))

filename = "LargeExcel.xlsx"
folder_path = "C:\\code\\py"

def print_upload_progress(offset):
    # type: (int) -> None
    # note: local_path is assigned below, before the upload session starts calling this
    file_size = os.path.getsize(local_path)
    print(
        "Uploaded '{0}' bytes from '{1}'...[{2}%]".format(
            offset, file_size, round(offset / file_size * 100, 2)
        )
    )

#upload to a folder
#target_url = "Shared Documents/folderA/folderB"

target_url = "Shared Documents"
target_folder = ctx.web.get_folder_by_server_relative_url(target_url)
size_chunk = 1000000
local_path = os.path.join(folder_path, filename)
with open(local_path, "rb") as f:
    uploaded_file = target_folder.files.create_upload_session(
        f, size_chunk, print_upload_progress, filename
    ).execute_query()

print("File {0} has been uploaded successfully".format(uploaded_file.serverRelativeUrl))

If you receive an error stating you don’t have access, double-check that you’ve added the App Registration to the target SharePoint site’s permissions. Again, this is covered in the blog post linked at the beginning of this post.

Consider this a workaround until MS Graph is out of its latest beta and there’s more support for easily uploading to SharePoint.

What if you need to upload a file and set a column value? When working with SharePoint via the API, you must be mindful of the column names. The column name in the UI might not be the same as the internal name, so I will use the script above as my starting point and add the following script to the end. In this example, I’m setting two fields: ReportName and ReportDate.

import datetime

#get the file that was just uploaded
file_item = uploaded_file.listItemAllFields

# Define a dictionary of field names and their new values
fields_to_update = {
    "ReportName": "My TPS Report",
    "ReportDate": datetime.datetime.now().isoformat(),
    # Add more fields here as needed
}

# Iterate over the dictionary and update each field
for field_name, new_value in fields_to_update.items():
    file_item.set_property(field_name, new_value)

# Commit the changes
file_item.update()
ctx.execute_query()

print("Report fields were updated")

How do you get a list of all the columns in a list or library? The script below will output each column’s internal name and display name.

from office365.sharepoint.client_context import ClientContext

cert_credentials = {
    "tenant": "abcxyz-1234-4567-8910-0e3d638792fb",
    "client_id": "abcddd-4444-4444-cccc-123456789111",
    "thumbprint": "7D8D8DF7D8D2F4DF8DF45D4FD8FD48DF5D8D",
    "cert_path": "RestClient\\NewCERT.pem"
}

ctx = ClientContext("https://tacoranch.sharepoint.com/sites/python").with_client_certificate(**cert_credentials)
current_web = ctx.web.get().execute_query()
print("{0}".format(current_web.url))

# Get the target list or library
list_or_library = ctx.web.lists.get_by_title('TPS-Reports')

# Load the fields
fields = list_or_library.fields.get().execute_query()

# Print the field names
for field in fields:
    print("Field internal name: {0}, Field display name: {1}".format(field.internal_name, field.title))



SharePoint Audit Using Purview

Today, I had a customer ask the following question:
“How can I pull a report showing all of the files in a library that have not been viewed?”
Typically, users request a report showing all the items or files accessed, so this request was unique.

You can run an audit search on almost everything in a cloud tenant using Microsoft Purview: every file downloaded or viewed, every list item opened, edited, or deleted, page views, OneDrive actions, Teams actions, and the list goes on and on.

In Purview, click on the Audit link in the left nav, and it will open the audit search page.
Select the time range you want to target
Activities: Accessed file
Record types: SharePointFileOperation
Search name: this can be anything you want, e.g., SP Library Search
File, folder, or site: https://taco.sharepoint.com/sites/test/TheLibrary/*
Workload: SharePoint
The key items to note are the record type and the file option. You can use a wildcard * to return results for everything in the target library. This will return a lot of data, so you will need to filter the report after downloading it. Once you’ve populated the fields, click Search, then wait a bit for it to complete. The amount of data in your tenant and the current workloads will determine how long the search takes.
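If you'd rather script the search than click through the Purview UI, here is a minimal sketch using the Exchange Online Search-UnifiedAuditLog cmdlet. The 90-day range is an assumption, and wildcard matching on -ObjectIds may behave differently than the UI's file/folder/site field:

# requires the ExchangeOnlineManagement module
Connect-ExchangeOnline

# pull FileAccessed events for the target library
$results = Search-UnifiedAuditLog `
    -StartDate (Get-Date).AddDays(-90) `
    -EndDate (Get-Date) `
    -RecordType SharePointFileOperation `
    -Operations FileAccessed `
    -ObjectIds "https://taco.sharepoint.com/sites/test/TheLibrary/*" `
    -ResultSize 5000

$results | Export-Csv -Path "C:\code\Purview_Audit.csv" -NoTypeInformation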


The completed search will look like this:

Clicking on the report name will open a detailed view of the audit search. From the results page, click the Export button and wait a few minutes for the file to be generated. If the page gets stuck at 0%, refresh your browser, and it should trigger the download.

Next, I needed to get all the files in the SharePoint library. To do this, I used PowerShell to connect to the target site and then downloaded the file info to a CSV.

# Connect to the SharePoint site interactively
Connect-PnPOnline -Url "https://taco.sharepoint.com/sites/test" -Interactive

# Specify the target library
$libraryName = "TheLibrary"

# Get all files from the target library
$files = Get-PnPListItem -List $libraryName -Fields "FileLeafRef", "ID", "FileRef", "Created", "Modified", "UniqueId", "GUID"

# Create an empty array to store the file metadata
$fileMetadata = @()

# Loop through each file and extract the relevant metadata
foreach ($file in $files) {
    $fileMetadata += [PSCustomObject]@{
        FileName = $file["FileLeafRef"]
        ID = $file["ID"]
        GUID = $file["GUID"]
        UniqueId = $file["UniqueId"]
        URL = $file["FileRef"]
        Created = $file["Created"]
        Modified = $file["Modified"]
    }
}

# Export the file metadata to a CSV file
$fileMetadata | Export-Csv -Path "C:\code\library_audit.csv" -NoTypeInformation

If you take anything away from this post, please take note of this: Purview uses a field named ListItemUniqueId to identify a SharePoint file or list item. My first thought was to use the GUID from the SharePoint library to match up to the Purview data. This is 100% incorrect! From SharePoint, UniqueId is the field that matches the Purview field ListItemUniqueId.

Basic logic:
SELECT SharePoint.*
FROM SharePoint
INNER JOIN Purview
ON SharePoint.UniqueId = Purview.ListItemUniqueId

I used Power BI to format and mash together the exported Purview data with the SharePoint data. Power BI isn’t strictly necessary; you can easily use Power Query in Excel to do the same thing. Below is my M code that parses the JSON in the Purview file, counts how many times each file was accessed, and captures the last time it was accessed.

let
    Source = Csv.Document(File.Contents("C:\ian240329\Purview_Audit.csv"),[Delimiter=",", Columns=8, Encoding=1252, QuoteStyle=QuoteStyle.None]),
    #"Promoted Headers" = Table.PromoteHeaders(Source, [PromoteAllScalars=true]),
    #"Changed Type" = Table.TransformColumnTypes(#"Promoted Headers",{{"RecordId", type text}, {"CreationDate", type datetime}, {"RecordType", Int64.Type}, {"Operation", type text}, {"UserId", type text}, {"AuditData", type text}, {"AssociatedAdminUnits", type text}, {"AssociatedAdminUnitsNames", type text}}),
    #"Parsed JSON" = Table.TransformColumns(#"Changed Type",{{"AuditData", Json.Document}}),
    #"Expanded AuditData" = Table.ExpandRecordColumn(#"Parsed JSON", "AuditData", {"ListItemUniqueId", "SourceFileExtension", "SourceFileName", "ObjectId"}, {"ListItemUniqueId", "SourceFileExtension", "SourceFileName", "ObjectId"}),
    #"Removed Columns" = Table.RemoveColumns(#"Expanded AuditData",{"AssociatedAdminUnits", "AssociatedAdminUnitsNames", "RecordId"}),
    #"Renamed Columns" = Table.RenameColumns(#"Removed Columns",{{"ObjectId", "File URL"}, {"SourceFileName", "File Name"}, {"SourceFileExtension", "File Extension"}}),
    #"Filtered Rows" = Table.SelectRows(#"Renamed Columns", each ([File Extension] <> "aspx")),
    #"Filtered Rows1" = Table.SelectRows(#"Filtered Rows", each true),
    #"Removed Columns1" = Table.RemoveColumns(#"Filtered Rows1",{"Operation", "RecordType"}),
    #"Grouped Rows" = Table.Group(#"Removed Columns1", {"ListItemUniqueId"}, {{"View Count", each Table.RowCount(_), Int64.Type}, {"Last Viewed", each List.Max([CreationDate]), type nullable datetime}})
in
    #"Grouped Rows"

Still working in Power Query, I created a new query to show which SharePoint files had not been accessed. My Purview license is limited to 6 months’ worth of data, so this is one hindrance to painting a full picture of what has and has not been accessed.

let
    Source = Table.NestedJoin(SharePoint_Audit, {"UniqueId"}, Purview_Audit, {"ListItemUniqueId"}, "Purview_Audit", JoinKind.LeftAnti),
    #"Expanded Purview_Audit" = Table.ExpandTableColumn(Source, "Purview_Audit", {"File Name"}, {"Purview_Audit.File Name"}),
    #"Sorted Rows1" = Table.Sort(#"Expanded Purview_Audit",{{"Purview_Audit.P File Name", Order.Descending}}),
    #"Filtered Rows1" = Table.SelectRows(#"Sorted Rows1", each true)
in
    #"Filtered Rows1"

With this data, I can now report to the user what files have not been accessed in the past 6 months.


Using New-PnPSite With A Multi Geo Tenant

If you run the PnP PowerShell command New-PnPSite using a managed identity or App Registration in a multi-geo tenant, it will create the site in the tenant’s default geo location. To get around this, you can use the PreferredDataLocation parameter to set the desired location, but you’ll also need to update your MS Graph permissions.


If you run the New-PnPSite command with the -PreferredDataLocation parameter and your permissions are not correct, you will receive this error:

Code: Authorization_RequestDenied Message: The requesting principal is not authorized to set group preferred data location.

Open your App Registration and add the following MS Graph application permissions:
Group.Create, Group.ReadWrite.All, Directory.ReadWrite.All
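With the permissions in place, the call looks something like this (the site details are placeholders, and EUR is one of the geo codes from the list linked below):

New-PnPSite -Type TeamSite `
    -Title "Multi Geo Demo" `
    -Alias "MultiGeoDemo" `
    -PreferredDataLocation EUR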

Other people who had the same issue:
https://github.com/pnp/powershell/issues/2629
https://github.com/pnp/PnP-PowerShell/issues/2682
https://learn.microsoft.com/en-us/answers/questions/1099399/unable-to-create-modern-team-site-using-pnp-powers

Complete list of the geo codes can be found here:
https://learn.microsoft.com/en-us/microsoft-365/enterprise/multi-geo-add-group-with-pdl?view=o365-worldwide#geo-location-codes


Modernizing Authentication in SharePoint Online

A year or two ago, Microsoft announced it would stop supporting and/or start blocking access to Azure Access Control Services (ACS) and the SharePoint Add-in model. This is important because ACS has been used for many years to grant apps and scripts API access to SharePoint sites, and you likely have many sites where this has been used. Moving forward, Microsoft Entra ID (Azure AD) App Registrations will be used in place of ACS.

Historically, you would start the permissions journey by generating the client ID and secret at this endpoint:
_layouts/15/AppRegNew.aspx
From there, you grant the newly created identity access to a tenant, sites, lists, libraries, or a combination.
_layouts/15/appinv.aspx
The other option was to create an Azure App Registration and then grant it access to the target objects. When working with SharePoint Online and AppRegNew.aspx, the App Registration is generated automatically. Depending on what is/is not configured, this can be an issue and set off alarms in Azure.

With that out of the way, how do you wire up a new connection to SharePoint Online, allowing PowerShell, Python, a script, or an app to access the SharePoint API or the Microsoft Graph API?

Overview of what I’m doing:
Create a self-signed cert
Add the cert to your personal cert store and upload it to Azure, creating an App Registration
Adjust permissions as needed
Grant the App Registration access to a specific SharePoint site
Use the newly created credentials to access the SharePoint site

#create cert with a password
New-PnPAzureCertificate `
    -CommonName "Demo_SP_Azure_2024" `
    -OutPfx c:\code\Demo_SP_Azure_2024.pfx `
    -OutCert c:\code\Demo_SP_Azure_2024.cer `
    -CertificatePassword (ConvertTo-SecureString -String "Taco@Good" -AsPlainText -Force) `
    -ValidYears 1

#import the cert. for this to work, run as Admin.
Import-PfxCertificate `
    -Exportable `
    -CertStoreLocation Cert:\LocalMachine\My `
    -FilePath c:\code\Demo_SP_Azure_2024.pfx `
    -Password (ConvertTo-SecureString -String "Taco@Good" -AsPlainText -Force)

I highly suggest not skipping the -Interactive switch in the next command. It will open a browser window where you must authenticate with an account that has adequate permissions to create a new App Registration in Azure. The most important thing to note in the script is SharePointApplicationPermissions "Sites.Selected". Why is this important? It is extremely useful if you want to limit permissions to a single SharePoint site and not every site in the tenant.

Register-PnPAzureADApp `
   -ApplicationName Demo_SP_Azure_2024 `
   -Tenant tacoranch.onmicrosoft.com `
   -Store CurrentUser `
   -SharePointApplicationPermissions "Sites.Selected" `
   -Interactive

After that runs, the output will include the Client ID and certificate thumbprint. Take note of them, or you can open Azure, navigate to App Registrations, and select All applications. In the left nav, click Certificates & secrets, where you’ll find the thumbprint; again in the left nav, click Overview, and you’ll see the Application ID, aka Client ID.

In the next two commands, you will connect to the SharePoint admin site interactively, then grant the newly created App Registration write access to the target SharePoint site.

Connect-PnPOnline -Url "https://tacoranch-admin.sharepoint.com" `
    -Interactive

#grant the App Reg write access to the site
Grant-PnPAzureADAppSitePermission `
    -AppId 'a95ddafe-6eca-488a-b26a-dc62a64d9105' `
    -DisplayName 'Demo_SP_Azure_2024' `
    -Site 'https://tacoranch.sharepoint.com/sites/cert-demo' `
    -Permissions Write


Now that the App Registration can access the SharePoint site, how do you connect to it using the certificate and thumbprint?

Connect-PnPOnline `
    -Tenant "tacoranch.onmicrosoft.com" `
    -Url "https://tacoranch.sharepoint.com/sites/cert-demo" `
    -ClientId "a95ddafe-6eca-488a-b26a-dc62a64d9105" `
    -Thumbprint "5C5891197B54B9535D171D6D9DD7D6D351039C8E" 

Get-PnPList | Select-Object Title

Using the above commands, you can create a cert, register it in Azure, and grant access to a single SharePoint site.

I’ve included a copy of the full script here:
https://www.sharepointed.com/wp-content/uploads/2024/03/Azure-App-Reg-Cert-Demo.txt

Error(s) and fixes:
Error:
Grant-PnPAzureADAppSitePermission: {"error":{"code":"accessDenied","message":"Access denied","innerError":{"date":"2024-03-21T16:29:47","request-id":"","client-request-id":""}}}
Fix:
Ensure the account running the Grant-PnPAzureADAppSitePermission command has access to the target SharePoint site.

Error:
Connect-PnPOnline: A configuration issue is preventing authentication - check the error message from the server for details. You can modify the configuration in the application registration portal. See https://aka.ms/msal-net-invalid-client for details. Original exception: AADSTS700027: The certificate with identifier used to sign the client assertion is not registered on application. [Reason - The key was not found., Thumbprint of key used by client: '6C5891197B54B9535D179D6D9DD7D6D351039D8Z', Please visit the Azure Portal, Graph Explorer or directly use MS
Graph to see configured keys for app Id 'eb7f9fbc-f4ee-4a48-a008-49d6bcdc6c40'. Review the documentation at https://docs.microsoft.com/en-us/graph/deployments to determine the corresponding service endpoint and https://docs.microsoft.com/en-us/graph/api/application-get?view=graph-rest-1.0&tabs=http to build a query request URL, such as 'https://graph.microsoft.com/beta/applications/aeb7f9fbc-f4ee-4a48-a008-49d6bcdc6c40']. Trace ID: 3e1e7ab3-60c0-4126-acb8-e2fdb6e28000 Correlation ID: 62ddc80b-aeb2-49c5-8c31-83a04e70bf6e Timestamp: 2024-04-01 12:21:59Z

Fix:
This one is easy; ensure you use the correct Client ID and thumbprint. You can get these from the App Registration page in the Azure portal.

How to upload a large file to SharePoint using the Microsoft Graph API

What started as a simple question from a co-worker turned into a rabbit hole exploration session that lasted a bit longer than anticipated. ‘Hey, I need to upload a report to SharePoint using Python.’

In the past, I’ve used SharePoint Add-in permissions to create credentials allowing an external service, app, or script to write to a site, library, list, or all of the above. However, the environment I’m currently working in does not allow Add-in permissions, and Microsoft has been slowly deprecating the service for a long time.

As of today (March 18, 2024), this is the only way I could find to upload a large file to SharePoint. Using the MS Graph SDK, you can upload files smaller than 4 MB, but that is useless in most cases.

For the script below, the following items are needed:
Azure App Registration:
Microsoft Graph application permissions:
Files.ReadWrite.All
Sites.ReadWrite.All
SharePoint site
SharePoint library (aka drive)
File to test with

import requests
import msal
import atexit
import os.path
import urllib.parse
import os

TENANT_ID = '19a6096e-3456-7890-abcd-19taco8cdedd'
CLIENT_ID = '0cd0453d-cdef-xyz1-1234-532burrito98'
CLIENT_SECRET  = '.i.need.tacos-and.queso'
SHAREPOINT_HOST_NAME = 'tacoranch.sharepoint.com'
SITE_NAME = 'python'
TARGET_LIBRARY = 'reports'
UPLOAD_FILE = 'C:\\code\\test files\\LargeExcel.xlsx'
UPLOAD_FILE_NAME = 'LargeExcel.xlsx'
UPLOAD_FILE_DESCRIPTION = 'A large excel file' #not required

AUTHORITY = 'https://login.microsoftonline.com/' + TENANT_ID
ENDPOINT = 'https://graph.microsoft.com/v1.0'

cache = msal.SerializableTokenCache()

if os.path.exists('token_cache.bin'):
    cache.deserialize(open('token_cache.bin', 'r').read())

atexit.register(lambda: open('token_cache.bin', 'w').write(cache.serialize()) if cache.has_state_changed else None)

SCOPES = ["https://graph.microsoft.com/.default"]

app = msal.ConfidentialClientApplication(CLIENT_ID, authority=AUTHORITY, client_credential=CLIENT_SECRET, token_cache=cache)

result = None
result = app.acquire_token_silent(SCOPES, account=None)

drive_id = None

if result is None:
    result = app.acquire_token_for_client(SCOPES)

if 'access_token' in result:
    print('Token acquired')
else:
    print(result.get('error'))
    print(result.get('error_description'))
    print(result.get('correlation_id')) 

if 'access_token' in result:
    access_token = result['access_token']
    headers={'Authorization': 'Bearer ' + access_token}

    # get the site id
    result = requests.get(f'{ENDPOINT}/sites/{SHAREPOINT_HOST_NAME}:/sites/{SITE_NAME}', headers=headers)
    result.raise_for_status()
    site_info = result.json()
    site_id = site_info['id']

    # get the drive / library id
    result = requests.get(f'{ENDPOINT}/sites/{site_id}/drives', headers=headers)
    result.raise_for_status()
    drives_info = result.json()
    
    for drive in drives_info['value']:
        if drive['name'] == TARGET_LIBRARY:
            drive_id = drive['id']
            break

    if drive_id is None:
        raise SystemExit(f'No drive named "{TARGET_LIBRARY}" found')  # stop here; the upload below needs a valid drive

    # create an upload session, then upload the file in chunks
    file_url = urllib.parse.quote(UPLOAD_FILE_NAME)
    result = requests.post(
        f'{ENDPOINT}/drives/{drive_id}/root:/{file_url}:/createUploadSession',
        headers=headers,
        json={
            '@microsoft.graph.conflictBehavior': 'replace',
            'description': UPLOAD_FILE_DESCRIPTION,
            'fileSystemInfo': {'@odata.type': 'microsoft.graph.fileSystemInfo'},
            'name': UPLOAD_FILE_NAME
        }
    )

    result.raise_for_status()
    upload_session = result.json()
    upload_url = upload_session['uploadUrl']

    st = os.stat(UPLOAD_FILE)
    size = st.st_size
    CHUNK_SIZE = 10485760
    chunks = (size + CHUNK_SIZE - 1) // CHUNK_SIZE  # ceiling division so an exact multiple of CHUNK_SIZE still uploads every chunk
    with open(UPLOAD_FILE, 'rb') as fd:
        start = 0
        for chunk_num in range(chunks):
            chunk = fd.read(CHUNK_SIZE)
            bytes_read = len(chunk)
            upload_range = f'bytes {start}-{start + bytes_read - 1}/{size}'
            print(f'chunk: {chunk_num} bytes read: {bytes_read} upload range: {upload_range}')
            result = requests.put(
                upload_url,
                headers={
                    'Content-Length': str(bytes_read),
                    'Content-Range': upload_range
                },
                data=chunk
            )
            result.raise_for_status()
            start += bytes_read

else:
    raise Exception('no access token')

In the script, I’m uploading the LargeExcel file to a library named reports in the python site. It is important to note that the words drive and library are used interchangeably when working with MS Graph. If you see a script example that does not specify a target library but only uses root, it will write the files to the default Documents / Shared Documents library.
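To make the distinction concrete, here is a sketch of the two addressing styles, reusing the ENDPOINT, site_id, drive_id, and file_url variables from the script above:

# default Documents / Shared Documents library of the site
default_url = f'{ENDPOINT}/sites/{site_id}/drive/root:/{file_url}:/createUploadSession'

# a specific library (drive) looked up by name, as the script does
named_url = f'{ENDPOINT}/drives/{drive_id}/root:/{file_url}:/createUploadSession'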

Big thank you to Keath Milligan for providing the foundation of the script.
https://gist.github.com/keathmilligan/590a981cc629a8ea9b7c3bb64bfcb417

How to Find Your Microsoft Forms Data: Locating the Linked Excel File

This started as a simple question: where is the backend Excel file for my group Forms form stored?

By default, a group form will save responses to an Excel file in the SharePoint site associated with the group. Within that site, the file is stored in the Documents, aka Shared Documents library.

Here is a quick way to track down the file:

With the form open, click on Responses.

Click on Open in Excel.

Depending on how your SharePoint library is configured, the file will either download to your computer or open in the browser. Open the file and click on the name or click the down arrow next to it.

When the window opens, it will show exactly where the file is stored.

In this example, the file is stored in the Shared Documents library on the Testing site. Again, this example shows Shared Documents, but on the site, it’s actually named Documents.

STOP, I don’t see the window noted in the above screenshot! This more than likely means you are working with a personal form.

Where is the Excel file stored for personal forms? Not where you’d guess, and not anywhere worthwhile. The file is more or less saved with the form and is inaccessible other than by downloading it.

What if I copy my personal form to a group? What will happen to the Excel file?
Don’t do this; just recreate the form from scratch. The copied form will retain the behavior of storing the file with the form, not in SharePoint.

How can I save form responses to a SharePoint list or Dataverse table? You would need to create a Flow to intercept the form response and then save it to the destination.

Will creating a Flow that saves form responses to another destination impact the form saving to Excel? No, the form will always use the backend Excel file as its data storage.

If I download a copy of the backend Excel file, will the downloaded copy be updated with new form submissions? No, the copy is disconnected from the source.

Build Your Own ChatGPT-Style App Using Power Automate and Power Apps

Just for fun, I wanted to see how I could replicate a basic ChatGPT-like experience using the tools available to me in the Power Platform. I will cover setting up the Power Automate flow that interfaces with the OpenAI completions endpoint using GPT-4 Turbo, and a Power App for the UI. In a future post, I will create another flow (or flows) to work with the Assistants API and take this to the next level.

A quick look at the app and how it works. The app will keep the chat context active using the completions endpoint until you want to clear it out. Like I said, it’s basic!

Here is what I used to get this up and running:
Power App
Power Automate flow with the HTTP connector
OpenAI account
OpenAI API key (credit card required)

Here is the layout of the flow and Power App


The key for this to work and keep the context of the conversation flowing is to pass all of the previous questions and responses back to the API each time a new question is asked. Yes, that can start to add up in terms of burning a lot of tokens for a large chat.

The input is converted to the following format and passed to the API by the app:
role: user
content: question asked
The response from the API has the same shape:
role: assistant
content: response from the API
Using the two, the collection is built and displayed in a gallery.
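To make that concrete, here is roughly what the messages collection looks like by the third turn (the content values are placeholders):

[
    {"role": "user", "content": "What is a taco?"},
    {"role": "assistant", "content": "A taco is a folded tortilla with a filling."},
    {"role": "user", "content": "What filling do you recommend?"}
]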

Power App setup
ButtonQuestion OnSelect property does the following (a sketch of the formula follows this list):
Set the button text
Set the varQuestion variable to the value of TextInputQuestion
Add the question and user role to the colResponses collection
Convert the colResponses collection to JSON
Call the Get HTTP flow, passing in the collection
With the response from the flow, add the value(s) to the colResponses collection
Reset TextInputQuestion
Set the button text
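Here is a minimal Power Fx sketch of that OnSelect logic. The flow name (GetHTTP) and its output property (response) are assumptions from my setup; adjust them to match your flow:

// show the user something is happening
Set(varButtonText, "Thinking...");
Set(varQuestion, TextInputQuestion.Text);
// add the question to the running conversation
Collect(colResponses, {role: "user", content: varQuestion});
// call the flow with the full conversation so the API keeps context
Set(varResult, GetHTTP.Run(JSON(colResponses, JSONFormat.Compact)));
// add the assistant's reply to the conversation
Collect(colResponses, {role: "assistant", content: varResult.response});
Reset(TextInputQuestion);
Set(varButtonText, "Ask")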

GalleryResponses properties:
Visible: If(IsBlank(ThisItem.content), false, true)
Text: ThisItem.content
X: If(ThisItem.role = "assistant", 40, 5)
AutoHeight: true

Now for the flow! The flow is triggered from the app and passes a value.
Messages: the new question from the app and previous responses and questions (collection)

The Parse JSON action will take the Messages input and shape the JSON used in the Select Messages action.

The Select Messages action will use the output of the Parse JSON action to help form the correct input for the HTTP action.

Using a Compose action, the model and messages values are set. Note: in the example, I’m using the GPT-4 Turbo preview.
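The Compose output ends up looking something like this; Select_Messages is how my Select Messages action appears in expressions, and the model name reflects the preview available at the time of writing:

{
    "model": "gpt-4-turbo-preview",
    "messages": @{body('Select_Messages')}
}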

The HTTP Request action uses the following values:
Method: Post
URL: https://api.openai.com/v1/chat/completions
Headers:
{
"Content-Type": "application/json",
"Authorization": "Bearer @{outputs('Compose_API_Key')}"
}
Body: output of the Compose Message Body


Another Parse JSON action is used to handle the response from the HTTP Request.


In the Select Response action, the output of Parse JSON Response is used to populate the role and content values.

The final action is to respond to the Power App. The Respond to a PowerApp or flow step will not work if you use the dynamic content value to populate the response. You must use this expression: outputs('Select_Response')

Again, if data is not sent back to the Power App, chances are you missed the sentence before this one.

As I mentioned at the beginning of this post, this is a basic example. If you see anything wrong with the process or places where it could be streamlined, please let me know.

Power Pages: Update SSL Cert, Website Bindings, and Website Authentication Key

If you are here, you’re likely dealing with a Power Pages cert that’s about to expire or already has. Updating the cert is fairly straightforward, but the kicker is ensuring the user making the update has the correct permissions. When updating the cert, also check that the Website authentication key is still valid. Scroll to the bottom to see more about the auth key.

Permissions needed to update the cert and the authentication key:
Owner on the App Registration associated with the portal
O365: Power Platform Admin and Global Admin
As noted in the warning below, you can be a Global Admin or an App Registration owner. Unless your sys admin is doing the update, or you are a sys admin, gaining access to the App Registration might be the easiest way to update the cert and re-bind the site.

Here are the steps I took to update my portal’s cert:
Uploaded the new cert under Manage custom certificates (note cert ID)
Navigate to the Set up custom domains and SSL page
Delete expired cert
In the SSL Bindings section, click +Add new and associate your host with the newly uploaded cert.

Here you can grab the Application ID and locate the associated App Registration.

Open the Azure portal, search for App Registrations, then search by the App ID. Once you have the App Reg., make sure you are an Owner. In the left nav of the app, click Owners, then add yourself or have an admin add you.

After that is updated, navigate back to the Power Pages admin portal and update the Authentication key. NOTE: the site will be offline for a few minutes.

Here are some helpful links related to this topic:
https://learn.microsoft.com/en-us/power-pages/admin/admin-roles
https://learn.microsoft.com/en-us/power-pages/admin/manage-auth-key


Effortlessly Trigger a Flow from a Power App: A Simple Step-by-Step Example

In this post, I want to show how easy it is to call a Flow from a Power App. The goal of the Power App is to pass values to the Flow, have it add them together, and return a result.

Starting from the Power Apps portal
click Create –> Blank app, Blank canvas app, name the app, select Tablet for the format option, then click the Create button.

Power App overview:

Field Type | Field Name | Option
Text Input | TextInputOne | Format: Number
Text Input | TextInputTwo | Format: Number
Label | LabelNumberOne
Label | LabelNumberTwo
Label | LabelTotal
Label | LabelMathResult
Button | ButtonCalc

Flow overview:
The Flow can be created directly in the Power App designer or the Power Platform portal. For this example, I’m going to use the portal.

From https://make.powerapps.com,
Click on New flow and select Automated cloud flow

Click the Skip button at the bottom of the window (this will make sense in a minute)

With the Flow designer open, click PowerApps or search for it, then click on PowerApps (V2)

In this step, add two number inputs to the action

I named my number inputs as follows: inputNumberOne and inputNumberTwo

The Flow will respond to the app using the Respond to a PowerApp or flow action. For the output, again select Number; I named mine outputNumber.

The formula should be: add(triggerBody()['number'], triggerBody()['number_1'])

Name the Flow Flow do Math and save it. You can test the Flow by clicking the Test button and supplying two input values. The Flow can be named something different, but this name aligns with the example below.

Back in the PowerApp, click the Power Automate icon.

With the Power Automate window open, click on Add flow and select the newly created Flow, or search for it and select it.

On the app design surface, select the button and update its OnSelect property to:
Set(varNumber, FlowDoMath.Run(TextInputOne.Text,TextInputTwo.Text).outputnumber)

Select the LabelMathResult field and set its Text value to varNumber

Run the app, input values in the text fields, then click the button.

What just happened?


The values of the two text input fields were passed to the Flow; it added them together and returned the result in the outputnumber field, and that value was then set to the varNumber variable.

In future posts, I will dive deeper into more complex examples.



Get the display value of a Dataverse choice field in Power BI

I’m working on a simple report to pull some data from Dataverse into a Power BI report. The data includes some choice fields, and when I first generated the report, I only retrieved the internal value of the fields, so my Status field values looked something like this: 10000001. Clearly, no one wants to see this; I needed to grab the display values: Ordered, Processing, Shipped.

Part of the problem was related to using a native query to pull in the data and not selecting the correct field; Dataverse exposes two or more columns for each choice field.
My query looked something like this:
Select title, status, createdon from orders where customerid = '875-6309'

The query should have been:
Select title, statusname, createdon from orders where customerid = '875-6309'

tl;dr
Try placing the word 'name' directly after your choice field name:
status becomes statusname
state becomes statename