Retrieve Office 365 audit logs using the Office 365 Management API and Azure Functions

For creating custom reports on Office 365 content, the best approach is to fetch the audit data from the Office 365 Management audit log, store it in a custom database and then build reports from it. In an earlier blog here, we looked at the steps to retrieve Office 365 audit log data using PowerShell. In this blog, we look at a similar process to gather audit data using the Office 365 Management API in Azure Functions.

Source Code: GitHub Repo

Update 17 Aug 2019 – The code shared below is a snippet for capturing audit log data and is not a complete solution. For a complete solution, please check the GitHub repo here.

Some of the key features of the solution are as follows:
1. Azure Functions 1.x – The solution repo uses Azure Functions 1.x but could be upgraded to 2.x. If SharePoint Online CSOM is used, the solution might need to stay on Azure Functions 1.x.
2. Microsoft.Azure.Storage.Common and WindowsAzure.Storage for the Azure Table operations
3. Newtonsoft.Json > 10.0.0.0 and SharePointPnPCoreOnline for the SharePoint Online CSOM calls

PS: Another blog is upcoming with more details about the setup required to start capturing the audit log.

Steps:

To start with, we will create an Azure AD app to connect to the Office 365 audit log data store. Creating the Azure AD app is straightforward; here is a quick blog with the steps.

After the Azure AD app is created, we will create an Azure Function (with function key authentication) to pull the data from the Office 365 content blobs. Before doing that, we need to subscribe to the audit content service first.

There are a few prerequisites for setting up the Azure content blob service, which are as follows:

  1. Enable the audit log service in the Security & Compliance Center. If not already done, it could be turned on via Start recording user and admin activity on the Audit log search page in the Security & Compliance Center. Microsoft is going to turn this on by default in the future.
  2. Turn on the subscription service of the Office 365 Management API. For this, make an authenticated POST request to the below URL to start the subscription on your tenancy, replacing {tenant_id} with the tenant ID from Azure Active Directory:
    https://manage.office.com/api/v1.0/{tenant_id}/activity/feed/subscriptions/start?contentType=Audit.SharePoint
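
As a quick sketch, the subscription could also be started from code. This assumes a bearer token for https://manage.office.com (acquired as shown later in this post, in the token variable) and a tenantId string variable; HttpClient and AuthenticationHeaderValue come from System.Net.Http and System.Net.Http.Headers.

using (HttpClient client = new HttpClient())
{
// Start the Audit.SharePoint content subscription for the tenancy
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", token);
string startUrl = "https://manage.office.com/api/v1.0/" + tenantId + "/activity/feed/subscriptions/start?contentType=Audit.SharePoint";
HttpResponseMessage response = client.PostAsync(startUrl, null).Result; // Blocking call, consistent with the other snippets in this post
// A success status code means the subscription is now active
}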

Next, back to the Azure Function, we will connect to the Office 365 Management API subscription service using the Azure AD app ID and secret with the code below. The process is a back-and-forth data pull from the content blobs, so read through the steps and code carefully as it might otherwise be a little confusing.

using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using System.Collections.Generic;
using Newtonsoft.Json;
using Microsoft.Azure.WebJobs.Extensions.Http;
using System.Threading.Tasks;
using Microsoft.IdentityModel.Clients.ActiveDirectory;
using Microsoft.Extensions.Logging;
string TenantID = "<TenantID>";
string authString = "https://login.windows.net/" + TenantID;
string SPServiceUrl = "https://manage.office.com/api/v1.0/" + TenantID + "/activity/feed/subscriptions/content";
string resourceId = "https://manage.office.com";
string clientId = "<Client App Id>";
string clientSecret = "<Client App secret>";
var authenticationContext = new AuthenticationContext(authString, false);
ClientCredential clientCred = new ClientCredential(clientId, clientSecret);
AuthenticationResult authenticationResult = null;
Task runTask = Task.Run(async () => authenticationResult = await authenticationContext.AcquireTokenAsync(resourceId, clientCred));
runTask.Wait();
string token = authenticationResult.AccessToken;

After connecting to the subscription service, we could request the content logs for SharePoint events within a time window. Note that the date-time values must be in UTC format.
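
For example, the startTime and endTime query parameters could be built as below. This is a sketch: the Management API expects both parameters together, at most 24 hours apart and within the last 7 days. The two variables are reused by the paging code later in this post.

// Build a 24-hour UTC window for the content listing call
DateTime endDate = DateTime.UtcNow;
DateTime startDate = endDate.AddHours(-24);
string startDateString = startDate.ToString("yyyy-MM-ddTHH:mm:ss");
string endDateString = endDate.ToString("yyyy-MM-ddTHH:mm:ss");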

The detailed audit log data is not returned by the initial data pull. The initial call to the Office 365 Management API returns content URIs that point to the detailed audit log data, which makes retrieval a two-step process: the first call lists the available content blobs (each with a content URI), and a second call to each content URI fetches the detailed log entries from the subscription service.

Since the audit log data returned from the Office 365 Management subscription service is paged, we need to loop through the NextPageUri response header to get the URI for the next data pull.

The below code shows the break-up of the data calls and the looping over the next page URI. A brief overview of the code is as follows:

  1. Use a do-while loop to call the content listing endpoint until no pages are left
  2. Call the initial data URI and get the response data
  3. Process the initial log data and convert it to JSON data objects
  4. Get the ContentUri property from each entry
  5. Call each content URI to get the detailed audit log data
  6. After the data is fetched, convert it to JSON data objects
  7. Add the entries to the final data collection
using System;
using System.Collections.Generic;
using System.Net.Http.Headers;
using System.Web;
using Newtonsoft.Json.Linq;
using System.Net.Http;
using Newtonsoft.Json;
using System.IO;
using Microsoft.Extensions.Logging;
using System.Linq;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;
using Microsoft.WindowsAzure.Storage.File;
using System.Text;
using CsvHelper;
using System.Threading.Tasks;
using System.Globalization;
using Microsoft.Online.SharePoint.TenantAdministration;
using System.Security;
using Microsoft.SharePoint.Client;
using System.Net;
//JSON object for Initial Call
public class AuditInitialReport
{
public string ContentUri { get; set; }
public string ContentId { get; set; }
public string ContentType { get; set; }
public string ContentCreated { get; set; }
public string ContentExpiration { get; set; }
}
// Wrapper for a page of results: the data plus the NextPageUri header value
public class AuditInitialDataObject
{
public string AuditNextPageUri { get; set; }
public List<AuditInitialReport> AuditInitialDataObj { get; set; }
}
public class AuditDetailedReport
{
public DateTime CreationTime { get; set; }
public string Id { get; set; }
public string Operation { get; set; }
public string Workload { get; set; }
public string ObjectId { get; set; }
public string UserType { get; set; }
public string UserTypeName { get; set; }
public string RecordType { get; set; }
public string RecordTypeName { get; set; }
public string UserId { get; set; }
public string EventSource { get; set; }
public string SiteUrl { get; set; }
public string Site { get; set; }
public string WebId { get; set; }
public string WebSiteName { get; set; }
public string ListId { get; set; }
public string ListName { get; set; }
public string ListItemUniqueId { get; set; }
public string ItemName { get; set; }
public string ItemType { get; set; }
public string SourceFileExtension { get; set; }
public string SourceFileName { get; set; }
public string SourceRelativeUrl { get; set; }
public string UserAgent { get; set; }
public string EventData { get; set; }
public string TargetUserOrGroupType { get; set; }
public string TargetUserOrGroupName { get; set; }
public string TargetExtUserName { get; set; }
public string UniqueSharingId { get; set; }
public string OrganizationId { get; set; }
public string UserKey { get; set; }
public string ClientIP { get; set; }
public string CorrelationId { get; set; }
}
public class AuditLogDataPull
{
string TenantID = "<TenantID>";
string authString = "https://login.windows.net/" + TenantID;
string SPServiceUrl = "https://manage.office.com/api/v1.0/" + TenantID + "/activity/feed/subscriptions/content";
string accessToken; // Bearer token acquired earlier via AcquireTokenAsync
ILogger log; // Azure Function logger passed in from the Run method
// Pull the audit data for the given UTC window (startDateString/endDateString built as shown earlier)
public List<AuditDetailedReport> GetAuditLogData(string startDateString, string endDateString)
{
AuditInitialDataObject auditInitialDataObject = null;
// Thread-safe collection since it is filled from the parallel loop below
var auditDetailReportsFinal = new System.Collections.Concurrent.ConcurrentBag<AuditDetailedReport>();
string urlParameters = $"?contentType=Audit.SharePoint&startTime={startDateString}&endTime={endDateString}";
// Loop through the Office 365 Management API calls till the NextPageUri is empty i.e. there are no pages left
do
{
// Get the initial data entry for the data pull
auditInitialDataObject = getAuditInitialData(SPServiceUrl, urlParameters);
// Get the next page URI to form the next parameter call
if (auditInitialDataObject.AuditNextPageUri != "")
urlParameters = "?" + auditInitialDataObject.AuditNextPageUri.Split('?')[1];
// List of JSON objects from the initial data call
List<AuditInitialReport> auditInitialReports = auditInitialDataObject.AuditInitialDataObj;
// To increase performance, call multiple endpoints at a time using parallel loops
int maxCalls = 200;
int count = 0;
Parallel.ForEach(auditInitialReports, new ParallelOptions { MaxDegreeOfParallelism = maxCalls }, (auditInitialReport) =>
{
int loopCount = System.Threading.Interlocked.Increment(ref count);
log.LogInformation("Looking at request " + loopCount);
// Each ContentUri points to a blob holding the detailed audit entries
List<AuditDetailedReport> auditDetailReports = getAuditDetailData(auditInitialReport.ContentUri);
log.LogInformation("Got Audit Detail Reports of " + auditDetailReports.Count + " for loop number " + loopCount);
foreach (AuditDetailedReport auditDetailReport in auditDetailReports)
{
auditDetailReportsFinal.Add(auditDetailReport);
}
});
} while (auditInitialDataObject.AuditNextPageUri != "");
return auditDetailReportsFinal.ToList();
}
// Method to get the data for the initial data pull
public AuditInitialDataObject getAuditInitialData(string SPServiceUrl, string urlParameters)
{
AuditInitialDataObject auditInitialDataObj = new AuditInitialDataObject();
try
{
List<AuditInitialReport> auditInitialReports = new List<AuditInitialReport>();
// **** Call the Http Client Service ****
HttpClient client = new HttpClient();
client.BaseAddress = new Uri(SPServiceUrl);
// Add an Accept header for JSON format.
client.DefaultRequestHeaders.Add("Authorization", "Bearer " + accessToken.ToString());
client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
// List data response.
HttpResponseMessage response = client.GetAsync(urlParameters, HttpCompletionOption.ResponseContentRead).Result; // Blocking call!
if (response.IsSuccessStatusCode)
{
// Parse the response body. Blocking!
Stream dataObjects = response.Content.ReadAsStreamAsync().Result;
StreamReader reader = new StreamReader(dataObjects);
string responseObj = reader.ReadToEnd();
auditInitialReports = JsonConvert.DeserializeObject<List<AuditInitialReport>>(responseObj);
IEnumerable<string> values;
if (response.Headers.TryGetValues("NextPageUri", out values))
{
auditInitialDataObj.AuditNextPageUri = values.First();
auditInitialDataObj.AuditInitialDataObj = auditInitialReports;
}
else
{
auditInitialDataObj.AuditNextPageUri = "";
auditInitialDataObj.AuditInitialDataObj = auditInitialReports;
}
}
else
{
log.LogError($"{(int)response.StatusCode} ({response.ReasonPhrase})");
}
}
catch(Exception ex)
{
log.LogError($"Error while fetching initial Audit Data. Error message – {ex.Message}");
}
return auditInitialDataObj;
}
// Method to get the detailed audit log data from a content blob URI
public List<AuditDetailedReport> getAuditDetailData(string contentUri)
{
List<AuditDetailedReport> auditDetailData = new List<AuditDetailedReport>();
try
{
int retries = 0;
bool success = false;
// **** Call the Http Client Service, with a simple retry for transient failures ****
HttpClient client = new HttpClient();
string urlParameters = "";
client.BaseAddress = new Uri(contentUri);
// Add an Accept header for JSON format.
client.DefaultRequestHeaders.Add("Authorization", "Bearer " + accessToken);
client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
while (!success && retries < 3)
{
retries++;
// List data response.
HttpResponseMessage response = client.GetAsync(urlParameters, HttpCompletionOption.ResponseContentRead).Result; // Blocking call!
if (response.IsSuccessStatusCode)
{
success = true;
// Parse the response body. Blocking!
Stream dataObjects = response.Content.ReadAsStreamAsync().Result;
StreamReader reader = new StreamReader(dataObjects);
string responseObj = reader.ReadToEnd();
auditDetailData = JsonConvert.DeserializeObject<List<AuditDetailedReport>>(responseObj);
}
}
}
catch(Exception ex)
{
log.LogError($"Error while getting Detailed Audit Data. Error message – {ex.Message}");
}
return auditDetailData;
}
}

After the data retrieval is complete, the final data set could be stored in an Azure Table for further processing; a minimal sketch follows.
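
The sketch below uses the WindowsAzure.Storage package mentioned earlier. It assumes AuditDetailedReport is extended to inherit from TableEntity (with PartitionKey and RowKey populated, e.g. from Workload and Id); the connection string setting and table name are illustrative.

// Persist the collected audit entries to an Azure Table (illustrative names)
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(System.Environment.GetEnvironmentVariable("AzureWebJobsStorage"));
CloudTableClient tableClient = storageAccount.CreateCloudTableClient();
CloudTable auditTable = tableClient.GetTableReference("AuditLogs");
auditTable.CreateIfNotExists();
foreach (AuditDetailedReport report in auditDetailReportsFinal)
{
// InsertOrReplace keeps the operation idempotent across repeated runs
auditTable.Execute(TableOperation.InsertOrReplace(report));
}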

Final Thoughts

The above custom process using Azure Functions and the Office 365 Management API allows us to connect to the audit log data through a custom job hosted in Azure. After getting the data, we could create reports on it or filter it.

Set up Accounts and secure passwords to run automation workloads in Azure Functions

In some of my previous blogs here, we have seen how we could use Azure Functions to automate processes and SharePoint workloads.

Most of these jobs run using elevated or stored privileged accounts, as the Azure Function runs in a different context than the user. There are various ways we could set up these accounts; some of the approaches are below:

  1. Azure AD Service Accounts
    • Suitable for all operations
    • Need access to the resource
    • Reusable across multiple workloads
  2. Azure AD Apps
    • Suitable for Graph Access
    • Need exact permissions set up
    • Might need Tenant Admin authentication
  3. SharePoint App Accounts
    • Suitable for SharePoint workloads.
    • Need Site and App specific privileges

The details of these accounts could be stored in the Azure Function App Settings (for dev and production) or in the local.settings.json file during local development.

The most important consideration is to prevent exposing password details in the Azure Function in case of unauthorized access. There are two ways we could achieve this:

  1. Encrypting the password and store in the Azure Function (PowerShell)
  2. Using Azure Key Vault to store and access password details (C#)

Encrypting Passwords in Azure Functions

For doing this, first let's create an encrypted password using the PowerShell script below.

## Create an Encrypted Password ##
$AESKey = New-Object Byte[] 32
[Security.Cryptography.RNGCryptoServiceProvider]::Create().GetBytes($AESKey)
Set-Content C:\Temp\<PasswordFileName>.key $AESKey
## Run the above code first, then upload the above file to the Function App##
Function Get-EncryptedPassword
{
param (
[Parameter(Mandatory=$true,HelpMessage='Please specify the key file path')][ValidateScript({Test-Path $_})][String]$KeyPath,
[Parameter(Mandatory=$true,HelpMessage='Please specify password in clear text')][ValidateNotNullOrEmpty()][String]$Password
)
$secPw = ConvertTo-SecureString -AsPlainText $Password -Force
$AESKey = Get-content $KeyPath
$Encryptedpassword = $secPw | ConvertFrom-SecureString -Key $AESKey
$Encryptedpassword
}
Get-EncryptedPassword -KeyPath C:\Temp\<PasswordFileName>.key -Password <Password>

Next, copy the key file to a bin folder in the Azure Function using the App Service Editor (Application Settings -> App Service Editor) and decrypt the password using the code below:

$userPass = $env:EncryptedPass
$keyPath = 'D:\home\site\wwwroot\<FunctionName>\bin\<PasswordFileName>.key'
$secPass = $userPass | ConvertTo-SecureString -Key (Get-Content $keyPath)
# Build a credential object to use in the automation workload
$creds = New-Object System.Management.Automation.PSCredential ('<UserName>', $secPass)

Using Azure Key Vault

For using Azure Key Vault, the steps are as follows:

  1. Create an Azure AD App and get the Client ID and Client Secret
  2. Create an Azure Key Vault and grant the above Azure AD app Get access to it; the Get secret permission suffices to read the secret
  3. Create a Secret in the Key Vault, store the password in it and note the secret URI
  4. Store the Secret URI, Client ID and Client Secret in the Azure Function App Settings
  5. Use the code below to retrieve the password as a SecureString.
public SecureString GetSecret()
{
try
{
// Get the Secret Uri of the Key Vault secret from App Settings
string SecretUri = System.Environment.GetEnvironmentVariable("<KeyVaultSecretUri>");
var kvToken = new KeyVaultClient(new KeyVaultClient.AuthenticationCallback(GetToken));
var kvSecret = kvToken.GetSecretAsync(SecretUri).Result;
SecureString secpass = new SecureString();
foreach (char charpass in kvSecret.Value)
{
secpass.AppendChar(charpass);
}
return secpass;
}
catch (Exception ex)
{
throw; // rethrow preserving the stack trace
}
}
public async Task<string> GetToken(string authority, string resource, string scope)
{
AuthenticationResult authResult = null;
try
{
string ClientId = System.Environment.GetEnvironmentVariable("ClientId");
string ClientSecret = System.Environment.GetEnvironmentVariable("ClientSecret");
var authContext = new AuthenticationContext(authority);
ClientCredential clientCred = new ClientCredential(ClientId, ClientSecret);
Task authTask = Task.Run(async () => authResult = await authContext.AcquireTokenAsync(resource, clientCred));
authTask.Wait();
}
catch (Exception ex)
{
throw new Exception($"Function Error at Token Retrieval: {ex.Message}");
}
if (authResult == null)
throw new InvalidOperationException("Failed to obtain the JWT token");
return authResult.AccessToken;
}
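
As a usage sketch, the SecureString returned by GetSecret could feed the SharePoint Online CSOM credentials; the site URL and the ServiceAccountUser app setting below are illustrative, and ClientContext and SharePointOnlineCredentials come from Microsoft.SharePoint.Client.

// Authenticate CSOM with the account name from App Settings and the Key Vault secret
string userName = System.Environment.GetEnvironmentVariable("ServiceAccountUser"); // illustrative setting name
using (ClientContext ctx = new ClientContext("<SiteUrl>"))
{
ctx.Credentials = new SharePointOnlineCredentials(userName, GetSecret());
ctx.Load(ctx.Web, w => w.Title);
ctx.ExecuteQuery();
}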

Conclusion

Above, we saw how we could set up accounts in Azure Functions for elevated access to SharePoint and other resource locations.

Provisioning complex Modern Sites with Azure Functions and Flow – Part 3 – Post Provisioning Site Configuration

In the previous two blogs, part 1 and part 2, we looked at the steps to create a Modern team site and apply a custom provisioning template to it. In this blog, we will look at the post provisioning steps to implement site specific requirements. Some of them could be:

1. Apply default values to list fields

2. Create a bunch of default folders

3. Manage Security groups (SP level) and permission levels.

4. Navigation level changes

5. Add/Enable web parts or custom actions (SPFx extensions)

Most of the above steps have been part of SharePoint provisioning processes for a long time; they are just less complex now, with provisioning templates doing a lot of the heavy lifting. I will be writing very soon about PnP template dos and don'ts.

Prerequisites

One key point to note is that with Modern Team Sites and Office 365 Groups, we cannot add AD security groups into an Office 365 Unified Group. For more information, please see this link.

The apply template process (link) takes about 45-90 min for complex templates (it is a long running process), so it is not possible to start the post process from a Flow wait state. Instead, we could trigger the post provisioning process on update of an inventory item, or poll an endpoint for the status. In our case, we triggered the process by updating the inventory list with a status indicating that the apply template process is complete.

Post Provisioning Process

1. The first step of the post provisioning process is to make sure that custom scripting is enabled on the team site (link) and that all dependencies are ready, such as term stores, navigation items, site pages, content types, site columns etc. For a complex site template, this step checks for failure conditions to make sure that all artefacts are in place.

Note: The below sequence of steps could vary based on the solution and site structure in place 
but this was the faster way to isolate issues and ensure dependencies.

After the failure checks are done, we will start with the site structure changes and navigation changes. For implementing navigation changes, check the blogs here and here.

2. Next, we will update any site specific site columns, site content types and permission level changes. An upcoming blog will have more details on this.

3. After the site level changes, we will move to list specific updates such as setting default values, modifying list properties etc.
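
As an illustration of the list specific updates, a small CSOM sketch for a field default value could look like the below; the list and field names are made up for the example, and ctx is an authenticated ClientContext.

// Set a default value on a list field (illustrative names)
List targetList = ctx.Web.Lists.GetByTitle("Documents");
Field statusField = targetList.Fields.GetByInternalNameOrTitle("DocumentStatus");
statusField.DefaultValue = "Draft";
statusField.Update();
ctx.ExecuteQueryRetry(); // PnP retry-aware execute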

4. Next, let's move on to the apps and site pages updates. These changes take time because of the SharePoint ALM lifecycle and possible duplications, so error handling is key. Please check the blog here, and for steps to deploy apps and web parts, here.

5. Before we finalize the site, let's provision folders and set their metadata. This is not a simple process if you have to set metadata values for a large number of folders, such as the 800 nested folders (child in parent) in our case. So we will use the metadata file override; all the values in the defaults file have to be hardcoded before the override. A plain folder creation sketch follows the note below.

Note: The metadata file override is not generally a good approach 
because of possible corruption that renders the library unusable, 
so do error handling for all cases. For the CSOM approach, check here.
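
For the plain folder creation part (without the metadata override), a simple CSOM sketch could be as below; the library and folder names are illustrative.

// Create nested default folders in a library (illustrative names)
List library = ctx.Web.Lists.GetByTitle("Documents");
Folder parent = library.RootFolder.Folders.Add("Contracts");
Folder child = parent.Folders.Add("2019"); // child folder in parent folder
ctx.Load(child);
ctx.ExecuteQueryRetry();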

6. Finally, we will set the property bag values for the site to set some of the tags that make the site searchable. Here is the blog for the same.
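
A short sketch using the OfficeDevPnP web extension methods is below; the key and value are illustrative, and AddIndexedPropertyBagKey marks the property as crawlable so it surfaces in search.

// Tag the site via the property bag and index the key for search (illustrative values)
ctx.Web.SetPropertyBagValue("BusinessUnit", "Finance");
ctx.Web.AddIndexedPropertyBagKey("BusinessUnit");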

Conclusion

Above, we saw the final part of the site provisioning process: setting site properties and attributes to prepare the site before handing it off to the business for use.

Provisioning complex Modern Sites with Azure Functions and Flow – Part 2 – Create and Apply Template

In the previous blog here, we got an overview of the high level architecture of a complex Modern team site provisioning process. In this blog, we will look at step 1 of the process, the Create and Apply Template process, in detail.

Before that, below are a few links to earlier blogs, as a refresher, covering the prerequisites for this post.

  1. Set up a Graph App to call Graph Service using App ID and Secret – link
  2. Sequencing HTTP Trigger Azure Functions for simultaneous calls – link
  3. Adding and Updating owners using Microsoft Graph Async calls – link

Overview

The Create and Apply Template process aims at the following:

  1. Create a blank modern team site using Groups Template (Group#0 Site template)
  2. Apply the provisioning template on the created site.

Step 1: Create a blank Modern team site

For creating a modern team site using CSOM, we will use the TeamSiteCollectionCreationInformation class of OfficeDevPnP. Before we create the site, we will make sure the site doesn't already exist.

Note: There is an issue with the Site Assets library not getting initialized 
when the site is created using the below code. 
Hence, calling the EnsureSiteAssetsLibrary method is necessary.
using Microsoft.Online.SharePoint.TenantAdministration;
using Microsoft.SharePoint.Client;
using OfficeDevPnP.Core.Sites;
using (ClientContext ctx = new ClientContext("<TenantAdminUrl>"))
{
// create new "modern" team site at the url
// https://[tenant].sharepoint.com/sites/mymodernteamsite
string alias = taskParam.TargetWebUrl.Substring(taskParam.TargetWebUrl.LastIndexOf('/') + 1);
// Before creating let's make sure the site doesn't exist
bool exists = false;
Task existtask = Task.Run(async () => exists = await ctx.AliasExistsAsync(alias));
existtask.Wait();
if (!exists)
{
log.Info("Creating site with a new Async method");
TeamSiteCollectionCreationInformation teamSiteCreation = new TeamSiteCollectionCreationInformation();
// Provide the team site parameters
teamSiteCreation.Alias = "<GroupAlias>";
teamSiteCreation.DisplayName = "<WebTitle>";
teamSiteCreation.IsPublic = true; // or false
teamSiteCreation.Description = "<Description>";
ClientContext teamcontext = ctx.CreateSiteAsync(teamSiteCreation).GetAwaiter().GetResult();
teamcontext.Load(teamcontext.Web, w => w.Lists, w => w.Url);
teamcontext.ExecuteQueryRetry();
//Call the Ensure Site Assets Library method to initialize the Site Assets library
teamcontext.Web.Lists.EnsureSiteAssetsLibrary();
teamcontext.ExecuteQueryRetry();
log.Info("The site has been successfully created at: " + teamcontext.Web.Url);
}
}

Step 2: Apply the Provisioning Template

Note: The apply template process is a long running process and takes 60-90 min to complete 
for a complex provisioning template with many site columns, content types and libraries. 
In order to prevent the Azure Function from timing out, it is necessary to host it 
using an App Service Plan instead of a Consumption plan, so the function 
is not affected by the 10 min timeout. 

For the apply provisioning template process, follow the steps below.

1. Reading the Template

It is important to note that the XMLPnPSchemaFormatter version (in the code below) must match the PnP schema version used to generate the PnP template. If the template was generated with an older version, then set the XMLPnPSchemaFormatter to read from that version. In order to find the version of a PnP template, open the XML and look at the schema namespace at the start of the file (for example, xmlns:pnp="http://schemas.dev.office.com/PnP/2018/01/ProvisioningSchema").

string filePath = "";
using (ClientContext ctx = new ClientContext("<SiteUrl>"))
{
string TemplateFilename = "<FileName>.xml";
ctx.Load(ctx.Web, w => w.Lists, w => w.Folders);
Microsoft.SharePoint.Client.File templateFile = ctx.Web.Lists.GetByTitle("Site Assets").RootFolder.Folders.GetByUrl("Templates").GetFile(TemplateFilename);
ctx.ExecuteQuery();
FileInformation fileInfo = Microsoft.SharePoint.Client.File.OpenBinaryDirect(ctx, templateFile.ServerRelativeUrl);
filePath = Path.Combine(Path.GetTempPath(), TemplateFilename);
using (var fileStream = new FileStream(filePath, FileMode.Create))
fileInfo.Stream.CopyTo(fileStream);
}
ProvisioningTemplate template = null;
using (FileStream fs = System.IO.File.OpenRead(filePath))
{
ITemplateFormatter formatter = XMLPnPSchemaFormatter.LatestFormatter; // Use the same schema version as the PnP template
template = formatter.ToProvisioningTemplate(fs);
}

2. Apply the Template

For applying the template, we will use the ProvisioningTemplateApplyingInformation class of the OfficeDevPnP module. ProvisioningTemplateApplyingInformation also has a property called HandlersToProcess which could be used to invoke only particular handlers of the provisioning template process. Below is the code for the same.

using (var ctx = new ClientContext("<SiteUrl>"))
{
ctx.RequestTimeout = Timeout.Infinite;
Web web = ctx.Web;
ctx.Load(web, w => w.Title);
ctx.ExecuteQueryRetry();
ProvisioningTemplateApplyingInformation ptai
= new ProvisioningTemplateApplyingInformation
{
ProgressDelegate = delegate (String message, Int32 progress, Int32 total)
{
log.Info(String.Format("{0:00}/{1:00} – {2}", progress, total, message));
},
IgnoreDuplicateDataRowErrors = true
};
web.ApplyProvisioningTemplate(template, ptai);
}

Since the flow will have timed out by the time the apply template process completes, we will invoke another flow to do the post processing by updating a list item in the SharePoint inventory list.

Conclusion

In this blog, we saw how we could create a modern team site and apply a provisioning template to it. In the next blog, we will finalize the process with site specific changes after applying the template.

Provisioning complex Modern Sites with Azure Functions and Microsoft Flow – Part 1 – Architecture

In one of my previous blogs here, I discussed creating Office 365 Groups using Azure Functions and Flow. The same process could also be used to provision Modern Team Sites in SharePoint Online, because Modern Team Sites are Office 365 Groups too. However, if you are creating a complex Modern Team Site with lots of libraries, content types, term store associated columns etc., it will be challenging to do it with a single Azure Function.

Thus, in this blog (part 1), we will look at the architecture of a solution to provision a complex Modern Team Site using multiple Azure Functions and Flows. This is an approach that went through four months of validation and testing. There might be other options, but this one worked for our complex team site, which takes around 45-90 mins to provision.

Solution Design

To start with, let's look at the solution design. The solution consists of two major components:

  1. Template Creation – Create a SharePoint Modern Team site to be used as a template and generate a Provisioning template from it
  2. Provisioning Process – Create a SharePoint inventory list to run the Flow and Azure Functions. There will be three Azure Functions that run three separate parts of the provisioning lifecycle. More details about the Azure Functions will be in an upcoming blog.

Get the Provisioning Template

The first step in the process is to create a clean site that will be used as a reference template site for the provisioning template. In this site, create all the lists, libraries, site columns and content types, and set the other necessary site settings.

In order to make sure that the generated template doesn't have any elements which are not needed for provisioning, use the following PnP PowerShell cmdlet. The below cmdlet removes any content type hub association, ALM API handlers and site security from the template, which are not needed for our provisioning requirements.

Get-PnPProvisioningTemplate -Out "<Folder location in drive>" -ExcludeHandlers ApplicationLifecycleManagement, SiteSecurity -ExcludeContentTypesFromSyndication

The output of the above cmdlet is a ProvisioningTemplate.xml file which could be applied to new sites to set up the same SharePoint elements. To know more about the provisioning template file, schema and allowed tags, check the link here.

(Diagram: generating the provisioning template from the reference site)

Team Site Provisioning Process

The second step in the process is to create and apply the template to a Modern SharePoint Team Site using Flow and Azure Functions. The detailed steps are as follows:

1. Create an Inventory list to capture all the requirements for Site Creation

2. Create two flows

a) Create and Apply Template flow, and

b) Post Provisioning Flow

3. Create three Azure Functions –

a) Create a blank Modern Team Site

b) Apply the Provisioning Template on the above site. This is a long running process and can take about 45-90 min for applying a complex template with about 20 libraries, 20-30 site columns and 10-15 content types

Note: Azure Functions on the Consumption plan have a timeout of 10 min. Host the Azure Function on an App Service Plan for the above to work without issues

c) Post Provisioning to apply changes that are not supported by Provisioning Template such as Creating default folders etc.

Below is the process flow for the provisioning process, going from creating the site to applying the template and post processing. A brief list of the steps is as follows:

  1. Call the Create Site flow to start the Provisioning Process
  2. Call the Create Site Azure Function
  3. Create the Modern Team Site in the Azure Function, set any dependencies required for the apply template step such as navigation items, pages etc., and then return to the flow
  4. Call the Apply Template Azure Function.
  5. Get the previously generated ProvisioningTemplate.xml file from a shared location
  6. Apply the Template onto the newly created Modern site. Note: The flow call times out because it cannot wait for such a long running process
  7. Update the status column in the Site Directory for the post provisioning flow to start
  8. Call the Post provisioning flow to run the Post provisioning azure function
  9. The Post provisioning Azure Function will complete the remaining SharePoint changes which were not covered by the apply template step, such as setting field default values, creating folders in libraries, associating default values to taxonomy fields etc.

(Diagram: team site provisioning process flow)

Conclusion:

Hence, in the above blog, we saw how to create a provisioning process to handle complex modern team site creation at a high architectural level. Next, we will deep dive into the Azure Functions that create the site, apply the template and run the post process in the upcoming blogs.

Happy Coding!!!