For creating custom reports on Office 365 content, the best approach is to fetch the audit data from the Office 365 Management Audit log, store it in a custom database, and then create reports from it. In an earlier blog here, we looked at the steps to retrieve Office 365 Audit log data using PowerShell. In this blog, we look at a similar process to gather audit data by using the Office 365 Management API in Azure Functions.
Update 17 Aug 2019 – The code shared below is a snippet for capturing Audit log data and is not a complete solution. For a complete solution, please check this GitHub repo here.
Some of the key features of the solution are as follows:
1. Azure Functions 1.x – the solution repo uses Azure Functions 1.x but could be upgraded to Azure Functions 2.x. If SharePoint Online CSOM is used, the solution might need to stay on Azure Functions 1.x.
2. Microsoft.Azure.Storage.Common and WindowsAzure.Storage for Azure Table operations
3. Newtonsoft.Json > 10.0.0.0 and SharePointOnlinePnPCore for SharePoint Online CSOM
PS: Another blog is upcoming with more details about the setup required to start capturing the Audit log.
To start with, we will create an Azure AD app to connect to the Office 365 Audit log data store. Even though it might sound difficult, creating the Azure AD app is quite simple; it is done right from the Azure AD portal. Here is a quick blog with steps for the same.
After the Azure AD app is created, we will create an Azure Function (with Function code authentication) to pull the data from the Office 365 Azure Content blob. Before doing that, we will need to subscribe to the service first.
There are a few prerequisites for setting up the Azure content blob service, which are as follows:
- Enable the Audit log service in the Security & Compliance Center. If it is not on already, it can be turned on via Start recording user and admin activity on the Audit log search page in the Security & Compliance Center. Microsoft plans to turn this on automatically in the future.
- Turn on the subscription service in the Office 365 Management API. To do this, call the subscription start endpoint on your tenancy, replacing the tenant ID with the tenant ID from Azure Active Directory.
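As an illustration, the subscription start call could be sketched as follows (Python here rather than the C# the solution repo uses; the function names are illustrative, and the bearer token is obtained from the Azure AD app as shown later):

```python
import json
import urllib.request

def subscription_start_url(tenant_id: str, content_type: str = "Audit.SharePoint") -> str:
    # A subscription must be started once per content type before any
    # audit content becomes available for that type.
    return (
        "https://manage.office.com/api/v1.0/"
        f"{tenant_id}/activity/feed/subscriptions/start"
        f"?contentType={content_type}"
    )

def start_subscription(tenant_id: str, access_token: str) -> dict:
    # POST with an empty body; the Authorization header carries the
    # Azure AD app's bearer token.
    req = urllib.request.Request(
        subscription_start_url(tenant_id),
        data=b"",
        method="POST",
        headers={"Authorization": f"Bearer {access_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Other content types (such as Audit.Exchange or Audit.AzureActiveDirectory) can be subscribed to the same way.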
Next, back in the Azure Function, we will connect to the subscription service using the Azure AD app ID and secret. The process is a back-and-forth data pull from the Azure Content blob, so read through the steps and code carefully, as it might be a little confusing otherwise.
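A sketch of acquiring the access token with the app ID and secret, assuming the Azure AD v1 client-credentials flow against the `https://manage.office.com` resource (Python for illustration; the helper names are hypothetical):

```python
import json
import urllib.parse
import urllib.request

TOKEN_URL = "https://login.microsoftonline.com/{tenant}/oauth2/token"
RESOURCE = "https://manage.office.com"

def build_token_request(tenant_id: str, client_id: str, client_secret: str):
    # Client-credentials grant: the app authenticates as itself, no user context.
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": RESOURCE,
    }).encode()
    return TOKEN_URL.format(tenant=tenant_id), body

def get_access_token(tenant_id: str, client_id: str, client_secret: str) -> str:
    url, body = build_token_request(tenant_id, client_id, client_secret)
    with urllib.request.urlopen(urllib.request.Request(url, data=body)) as resp:
        return json.load(resp)["access_token"]
```

The returned bearer token is then sent in the Authorization header of every Management API call.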
After connecting to the Azure subscription service, we can request content logs for SharePoint events within a time window. Note that the date-time values must be in UTC.
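For illustration, the listing URL with a UTC time window could be built like this (hypothetical helper; the API itself limits the window to at most 24 hours):

```python
import urllib.parse
from datetime import datetime, timedelta, timezone

API_ROOT = "https://manage.office.com/api/v1.0"

def content_list_url(tenant_id: str, content_type: str,
                     start: datetime, end: datetime) -> str:
    # startTime/endTime must be UTC and no more than 24 hours apart.
    fmt = "%Y-%m-%dT%H:%M:%S"
    query = urllib.parse.urlencode({
        "contentType": content_type,
        "startTime": start.strftime(fmt),
        "endTime": end.strftime(fmt),
    })
    return f"{API_ROOT}/{tenant_id}/activity/feed/subscriptions/content?{query}"

# Example: request the last 24 hours, expressed in UTC.
end = datetime.now(timezone.utc)
url = content_list_url("<tenant-id>", "Audit.SharePoint",
                       end - timedelta(hours=24), end)
```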
The initial data pull from the Office 365 Management API does not return the detailed audit logs; instead it returns content URIs that point to the detailed audit log data. Retrieving the logs is therefore a two-step process: first get the content blob details from the listing call, then call each blob's content URI to fetch the detailed log entries from the subscription service.
Since the audit log data returned from the Office 365 Management subscription service is paged, we need to loop through the NextPageUri response header to get the URI for the next data pull.
The code below breaks up the data calls and loops through the next page URI. A brief overview of the code is as follows:
- Use a do-while loop to call the initial data URI
- Call the initial data URI and get the response data
- Process the initial log data and convert it to JSON data objects
- Get the ContentUri property from each entry
- Call the content URI to get the detailed audit log data
- After the data is fetched, convert it to JSON data objects
- Add them to the final data collection
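The steps above can be sketched as follows (Python for illustration; `http_get` is an injected helper that returns the parsed JSON body and the response headers, so the two-step pull and the paging can be shown without a live tenancy):

```python
def fetch_audit_records(list_url, http_get):
    """Pull all audit records, following NextPageUri until exhausted.

    http_get(url) is expected to return (parsed_json_body, response_headers).
    """
    records = []
    next_url = list_url
    while next_url:
        # Step 1: the listing call returns content blob descriptors, not events.
        blobs, headers = http_get(next_url)
        for blob in blobs:
            # Step 2: each blob's contentUri holds the detailed audit entries.
            details, _ = http_get(blob["contentUri"])
            records.extend(details)
        # The URI for the next page comes back in the NextPageUri header;
        # when it is absent, all pages have been consumed.
        next_url = headers.get("NextPageUri")
    return records
```

Injecting `http_get` also makes the paging logic easy to unit test with canned pages.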
After the data retrieval is complete, the final data set could be stored in an Azure Table for further processing.
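As a sketch of that storage step (Python with the `azure-data-tables` package standing in for the WindowsAzure.Storage calls the repo uses; the table name and key scheme are illustrative):

```python
import json

def to_table_entity(record: dict) -> dict:
    # CreationTime and Id are standard properties on Management API audit
    # records; partitioning by day keeps a day's rows together (illustrative).
    return {
        "PartitionKey": record["CreationTime"][:10],
        "RowKey": record["Id"],
        "Operation": record.get("Operation", ""),
        "UserId": record.get("UserId", ""),
        "RawJson": json.dumps(record),
    }

def store_records(records, connection_string, table_name="AuditLog"):
    # Deferred import so the shaping helper works without the package installed.
    from azure.data.tables import TableClient
    client = TableClient.from_connection_string(connection_string, table_name)
    for record in records:
        # Upsert so re-running the function for the same window is idempotent.
        client.upsert_entity(to_table_entity(record))
```

Keeping the raw JSON alongside a few promoted columns makes later report queries cheap without losing detail.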
The above custom process using an Azure Function and the Office 365 Management API allows us to connect to the Audit log data through a custom job hosted in Azure. After getting the data, we could filter it and create reports from it.