While using the SharePoint "Send an HTTP request to SharePoint" flow action to update a person or group field, I kept getting this error:
A 'PrimitiveValue' node with non-null value was found when trying to read the value of a navigation property; however, a 'StartArray' node, a 'StartObject' node, or a 'PrimitiveValue' node with null value was expected.
The field I was attempting to update is named Submitted By, with an internal name of Submitted_x0020_By. Each time I tried to update the field I saw the error noted above. It wasn’t until I looked at one of my previous flow runs that I noticed what the issue was. It turns out that the field name I should have been using is Submitted_x0020_ById.
Update flow:
How do you update a Person field if the field allows for multiple selections? The example below will update the field with two different user values, but clearly, this could be extended to be more dynamic.
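To make the shape of that request concrete, here is a minimal sketch of the JSON body a "Send an HTTP request to SharePoint" action would post for a multi-select person field. The user IDs (11 and 22) are hypothetical site user IDs — look up your own via /_api/web/siteusers.

```python
# Sketch of the request body for updating a multi-select person field
# via SharePoint's REST API. The user IDs are hypothetical placeholders.
import json

def person_field_body(internal_name, user_ids):
    # SharePoint expects the field's internal name with an "Id" suffix,
    # and the selected user IDs wrapped in a "results" array.
    return {internal_name + "Id": {"results": list(user_ids)}}

body = person_field_body("Submitted_x0020_By", [11, 22])
print(json.dumps(body))
# -> {"Submitted_x0020_ById": {"results": [11, 22]}}
```

Adding or removing IDs in the list is all it takes to make this more dynamic — for example, feeding it IDs resolved from a prior "Get user" step.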
from office365.runtime.auth.authentication_context import AuthenticationContext
from office365.sharepoint.client_context import ClientContext
from office365.sharepoint.files.file import File
app_settings = {
'url': 'https://YOURtenant.sharepoint.com/sites/somesite/',
'client_id': '12344-abcd-efgh-1234-1a2d12a21a2121a',
'client_secret': 'Oamytacohungry234343224534543=',
}
context_auth = AuthenticationContext(url=app_settings['url'])
context_auth.acquire_token_for_app(client_id=app_settings['client_id'], client_secret=app_settings['client_secret'])
ctx = ClientContext(app_settings['url'], context_auth)
web = ctx.web
ctx.load(web)
ctx.execute_query()
response = File.open_binary(ctx, "/Shared Documents/Invoice.pdf")
with open("./Invoice.pdf", "wb") as local_file:
    local_file.write(response.content)
If the above script does not work, step back and ensure you are connected to the site. The following script connects to a site and outputs its title. This is useful to validate that a site connection can be made.
from office365.runtime.auth.authentication_context import AuthenticationContext
from office365.sharepoint.client_context import ClientContext
from office365.sharepoint.files.file import File
app_settings = {
'url': 'https://YOURtenant.sharepoint.com/sites/somesite/',
'client_id': '12344-abcd-efgh-1234-1a2d12a21a2121a',
'client_secret': 'Oamytacohungry234343224534543=',
}
context_auth = AuthenticationContext(url=app_settings['url'])
context_auth.acquire_token_for_app(client_id=app_settings['client_id'], client_secret=app_settings['client_secret'])
ctx = ClientContext(app_settings['url'], context_auth)
web = ctx.web
ctx.load(web)
ctx.execute_query()
print("Site title: {0}".format(web.properties['Title']))
You can also use a certificate and thumbprint to connect to SPO via an Azure App registration.
SharePlum connection example using a username and password to connect to SharePoint Online. More details about SharePlum can be found here: https://github.com/jasonrollins/shareplum
from shareplum import Site
from shareplum import Office365
sharepoint_url = 'https://YOURtenant.sharepoint.com/sites/spdev'
username = 'You@YourDomain.com'
password = 'Password'
authcookie = Office365('https://YOURtenant.sharepoint.com',
                       username=username,
                       password=password).GetCookies()
site = Site(sharepoint_url,
            authcookie=authcookie)
sp_list = site.List('Your List')
data = sp_list.GetListItems('All Items', row_limit=200)
If you get the following error, multi-factor authentication is enforced on the account; you won’t be able to connect with a username and password, and you’ll need to use an App Password.
File "C:\Python311\Lib\site-packages\shareplum\office365.py", line 80, in get_security_token
    raise Exception('Error authenticating against Office 365. Error from Office 365:', message[0].text)
Exception: ('Error authenticating against Office 365. Error from Office 365:', "AADSTS50076: Due to a configuration change made by your administrator, or because you moved to a new location, you must use multi-factor authentication to access ''.")
Scenario: Each day I have a couple of Azure Runbooks export SharePoint list items and upload them to a SharePoint library. If one of the Runbooks fails, I needed to send an email alert that something went wrong.
Basic logic: if the number of files created today in SharePoint <> X, send an email.
The easy solution would have been to loop through the files, check their created date, increment a variable, then make a condition statement.
More-better way:
1. Run the flow daily at 6:00 PM
2. Send an HTTP request to SharePoint to get the files
3. Parse the response
4. Condition statement — if true, send an email
Edit – If you want to search for files created before or after a date, you can adjust the API call like this: and created %3E 2021-12-12T19:07:51.0000000Z. This will fetch any files created after Dec 12th, 2021. The URL-encoded value for greater than (>) is %3E, and for less than (<) it is %3C.
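As a quick sanity check of those escapes, Python's standard library produces the same encoding — this only illustrates the %3E/%3C values using the example timestamp above, not the flow action itself:

```python
# '>' URL-encodes to %3E and '<' to %3C; spaces, colons, and dots are
# kept literal here so the fragment matches the filter shown above.
from urllib.parse import quote

filter_fragment = quote("created > 2021-12-12T19:07:51.0000000Z", safe=" :.")
print(filter_fragment)  # created %3E 2021-12-12T19:07:51.0000000Z
print(quote(">"))       # %3E
print(quote("<"))       # %3C
```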
I’m in the process of reorganizing a document library and wanted to store all of the documents in alphabetical folders. Yes, I’m using metadata, but I’ve passed the magic 5,000 item threshold and want to rearrange the library and leverage a rich search experience.
So, using PowerShell, how do you create a bunch of folders going from A to Z?
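One way to do this is a sketch using PnP PowerShell's Add-PnPFolder cmdlet, looping over the ASCII codes for A through Z. The site URL and the "Shared Documents" library name are placeholders — swap in your own, and note this assumes you are already set up to connect with Connect-PnPOnline.

```powershell
# Hypothetical sketch: create folders A-Z in a document library.
# Site URL and library name are placeholders.
Connect-PnPOnline -Url "https://YOURtenant.sharepoint.com/sites/somesite" -Interactive

# 65..90 are the ASCII codes for the letters A through Z
65..90 | ForEach-Object {
    $name = [char]$_
    Add-PnPFolder -Name $name -Folder "Shared Documents"
}
```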
I was using a Power Automate flow to break inheritance on a folder, and an error was being returned. The issue turned out to be the path I was trying to use for the folder.
This did not work: LibraryName/Folder
This DID work: /sites/ParentSite/SubSite/LibraryName/Folder
As of today, there is not a Logic App trigger for Azure File Storage, so I went with a schedule-based approach. Yes, this example leaves out a lot of fine-tuning, but it will get you headed in the right direction.
1. Create a blank Logic App
2. Trigger: Schedule
3. Action: Azure File Storage – List files
4. Action: SharePoint – Create file
After you add the SharePoint action, the Logic App should automatically add a For Each action and place the SharePoint Create File action inside of it.
Screenshots: Overview of the Logic App; For each action expanded; Testing the Logic App
In the last screenshot, I tested the Logic App by uploading a couple of documents with Azure Storage Explorer, then manually running the Logic App (by clicking the Run button).
Again, this is a simple example. The example does not account for processing the same files over and over…