Office365-AzureHybrid: Building an automated solution to pull Office 365 Audit logs

Custom reporting on Office 365 activity can be built from the Audit logs available in the Security and Compliance Center. In previous blogs here, we have seen how to fetch that data using PowerShell and the Office 365 Management API. In this blog, we will look at the planning, prerequisites and rationale that help decide between the two approaches.

The Office 365 Audit logs can be fetched from the Security and Compliance Center once auditing is enabled. Logging is currently not on by default; it can be turned on (if not done already) via Start recording user and admin activity on the Audit log search page in the Security & Compliance Center. Microsoft has indicated this will be switched on automatically in the future. Once enabled, audit information across all Office 365 services is tracked.
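If you prefer to script this step, the same switch can be flipped from Exchange Online PowerShell. A minimal sketch, assuming an Exchange Online session is already established (as in the PowerShell approach described later in this post):

```powershell
# Turn on unified audit log ingestion for the tenant
Set-AdminAuditLogConfig -UnifiedAuditLogIngestionEnabled $true

# Verify the current state of the setting
Get-AdminAuditLogConfig | Select-Object UnifiedAuditLogIngestionEnabled
```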

The Audit log search in the Security and Compliance Center lets you search the audit logs, but it is limited in what it provides and it can take a long time to return results.

For better efficiency and performance, the data can be pulled out of the Office 365 Audit logs into your own store, but that requires custom hosting.

Planning and prerequisites:

A few considerations for a custom process are as follows:

  1. Need additional compute to process the data – since the audit log data is large and queries take a long time, it is recommended to run a periodic job that fetches the data from the Office 365 audit log. This could be done using a PowerShell job or an Azure Function App, as detailed below.
  2. Need additional hosting for storing Office 365 Audit log data – the records could range from 5,000 to 20,000 per hour depending on the data sources and data size. To make later retrieval easier, it is advisable to store the data in a custom database. Since storage costs can be significant, it is recommended to use either dedicated hosting or a NoSQL store such as Azure Tables/Cosmos DB (Azure) or SimpleDB/DynamoDB (AWS).
  3. Might need an additional Service Account or Azure AD App – since the data is retrieved by an elevated process, it is recommended to run it with an Azure AD app or a service account. For more information, please refer to the blog here.

Scenarios:

Some of the scenarios where Office 365 Audit log data can be useful:

  1. Creating custom reports for user activities and actions
  2. Storing audit log data for more than 90 days
  3. Custom data reporting and alerts that are not supported in the Security and Compliance Center

Approaches:

Below are a few approaches to pull the data from the Office 365 Audit logs, along with the benefits and limitations of each to help decide on an implementation.

Using PowerShell

The Search-UnifiedAuditLog cmdlet in Exchange Online PowerShell can be used to retrieve data from the Office 365 Audit log. More implementation details can be found in the blog here.

Benefits:

  1. No additional compute hosting required. The PowerShell script can be run on a local system with a service account or on a server.
  2. One-off data pulls can be done and retrieved whenever needed
  3. Data going back more than 90 days can be retrieved from the Office 365 Audit log
  4. No session timeout constraints as long as the PowerShell console stays active
  5. Local date formats can be used to fetch data

Limitations:

  1. Needs Tenant Admin rights when connecting to Exchange Online PowerShell so that the cmdlets can be downloaded
  2. Needs a connection to Exchange Online PowerShell every time the data is retrieved
  3. Cannot easily be run in the cloud on Azure or AWS, as connecting to the Exchange Online PowerShell cmdlets is not possible in a serverless environment
  4. Can run for long periods, which could stretch to hours of continuous execution

Using Office 365 Management API:

The Office 365 Management API provides another way to retrieve the audit data, using a subscription service and an Azure AD app. For more detailed information, please check the blog here.

Benefits:

  1. Supports any language such as C#, JavaScript, Python etc.
  2. Parallel processing allows greater speed and flexibility of data management
  3. Data pulls can be sized to the data volume, increasing efficiency and performance

Limitations:

  1. Additional compute hosting is required, such as serverless workloads or web jobs, to process the data
  2. Needs an Azure AD app or OAuth layer to connect to the subscription service
  3. Needs additional time-zone handling, since all dates used when retrieving data are in GMT/UTC
  4. Session timeouts can occur on data pulls involving large datasets, so it is advisable to use smaller time windows per pull
  5. A multi-level data pull is required to fetch the audit log. Please check the blog here for more information

Final Thoughts

Both PowerShell and the Office 365 Management Activity API are great ways to fetch Office 365 Audit log data for custom reports. The points above should help decide on an approach to fetch and process the data efficiently. For more details on the steps, please check the blogs here (PowerShell) and here (Office 365 Management API).

Retrieve Office 365 audit logs using Office Management API and Azure Functions

For creating custom reports on Office 365 activity, the best approach is to fetch the audit data from the Office 365 Management Audit log, store it in a custom database and then report off that. In an earlier blog here, we looked at the steps to retrieve Office 365 Audit log data using PowerShell. In this blog, we look at a similar process to gather audit data using the Office 365 Management API in Azure Functions.

Source Code: GitHub Repo

Update 17 Aug 2019 – the code shared below is a snippet for capturing Audit log data and is not a complete solution. For a complete solution, please check the GitHub repo here.

Some of the key features of the solution are as follows:
1. Azure Functions 1.x – the solution repo uses Azure Functions 1.x but could be upgraded to Azure Functions 2.x. If SharePoint Online CSOM is used, the solution might need to stay on Azure Functions 1.x.
2. Microsoft.Azure.Storage.Common and WindowsAzure.Storage for the Azure Table operations
3. Newtonsoft.Json > 10.0.0.0 and SharePointOnlinePnPCore for the SharePoint Online CSOM calls

PS: There is another blog coming with more details about the setup required to start capturing the Audit log.

Steps:

To start with, we will create an Azure AD app to connect to the Office 365 Audit log data. Even though it might sound difficult, creating the Azure AD app is quite simple and is done from the Azure AD portal. Here is a quick blog with the steps.

After the Azure AD app is created, we will create an Azure Function (with function-key authentication) to pull the data from the Office 365 content blobs. Before doing that, we need to subscribe to the Management API service first.

There are a few prerequisites for setting up the content blob subscription, as follows:

  1. Enable the Audit log service in the Security and Compliance Center. This can be turned on (if not done already) via Start recording user and admin activity on the Audit log search page. Microsoft plans to enable this automatically in the future.
  2. Turn on the subscription service in the Office 365 Management API. To do this, call the URL below (a POST request authenticated with the Azure AD app's bearer token, as sketched after this list) to start the subscription on your tenancy. Replace {tenant_id} with the tenant ID from Azure Active Directory.
    https://manage.office.com/api/v1.0/{tenant_id}/activity/feed/subscriptions/start?contentType=Audit.SharePoint
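A minimal PowerShell sketch of that call, using the Azure AD app from the previous step; the tenant ID, client ID and secret below are placeholders:

```powershell
# Placeholder values - replace with your tenant and Azure AD app details
$tenantId     = "<tenant-guid>"
$clientId     = "<azure-ad-app-id>"
$clientSecret = "<azure-ad-app-secret>"

# App-only access token for the Office 365 Management API
$token = (Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/token" `
    -Body @{
        grant_type    = "client_credentials"
        resource      = "https://manage.office.com"
        client_id     = $clientId
        client_secret = $clientSecret
    }).access_token

# Start the Audit.SharePoint subscription (a POST request, not a browser GET)
Invoke-RestMethod -Method Post `
    -Uri "https://manage.office.com/api/v1.0/$tenantId/activity/feed/subscriptions/start?contentType=Audit.SharePoint" `
    -Headers @{ Authorization = "Bearer $token" }
```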

Next, back in the Azure Function, we connect to the subscription service using the Azure AD app ID and secret, as in the code below. The overall process is a back-and-forth data pull from the content blobs, so read through the steps and code carefully as it can be a little confusing otherwise.
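The solution repo does this inside the Azure Function itself; as an illustration of the same client-credentials call, here is a minimal PowerShell sketch that builds the authorization header reused by the calls that follow (tenant ID, app ID and secret are placeholders):

```powershell
# Same client-credentials token call as in the subscription step
$token = (Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/token" `
    -Body @{
        grant_type    = "client_credentials"
        resource      = "https://manage.office.com"
        client_id     = $clientId
        client_secret = $clientSecret
    }).access_token

# Authorization header used by all subsequent Management API calls
$headers = @{ Authorization = "Bearer $token" }
```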

After connecting to the subscription, we can request the content logs for SharePoint events over a time window. Note that the date-time values must be in UTC.

The detailed audit log data is not provided in the initial data pull. The initial call to the Office 365 Management API returns a content URI pointing to the detailed audit log data, so fetching records is a two-step process: the first call returns the content blob listing with its content URIs, and a second call to each content URI returns the detailed audit log entries from the subscription service.

Since the audit log data returned from the Management API subscription service is paged, we need to loop through the NextPageUri values to get the URI for each subsequent data pull.

The code below (after the step list) breaks down the data calls and the looping over the next-page URI. A brief overview of the code is as follows:

  1. Use a do-while loop starting with the initial content listing URI
  2. Call the current URI and get the response data
  3. Convert the content listing response to JSON objects
  4. Read the ContentUri property of each entry
  5. Call each content URI to get the detailed audit log data
  6. After the detailed data is fetched, convert it to JSON objects
  7. Add the records to the final data collection
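To illustrate the flow above, here is a minimal PowerShell sketch of the same REST calls. It assumes the $tenantId and $headers values from the earlier token sketch; the time window, content type and record handling are placeholders to adapt:

```powershell
# Time window for the content listing - the API expects UTC timestamps
$startTime = (Get-Date).ToUniversalTime().AddHours(-6).ToString("yyyy-MM-ddTHH:mm:ss")
$endTime   = (Get-Date).ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:ss")

# 1. Initial content listing URI for SharePoint audit events
$uri = "https://manage.office.com/api/v1.0/$tenantId/activity/feed/subscriptions/content" +
       "?contentType=Audit.SharePoint&startTime=$startTime&endTime=$endTime"

$allRecords = @()

do {
    # 2. Call the current (initial or next-page) URI and read the response
    $response = Invoke-WebRequest -Uri $uri -Headers $headers -UseBasicParsing

    # 3. Convert the content listing to JSON objects (one entry per content blob)
    $contentBlobs = $response.Content | ConvertFrom-Json

    foreach ($blob in $contentBlobs) {
        # 4-5. Each listing entry exposes a contentUri pointing at the detailed audit data
        $details = Invoke-RestMethod -Uri $blob.contentUri -Headers $headers

        # 6-7. The detailed call returns the individual audit records - add them to the result set
        $allRecords += $details
    }

    # Paged results: the next page URI (if any) comes back in the NextPageUri response header
    if ($response.Headers.ContainsKey("NextPageUri")) {
        $uri = [string]$response.Headers["NextPageUri"]
    } else {
        $uri = $null
    }
} while ($uri)

Write-Output "Retrieved $($allRecords.Count) audit records"
```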

After the data retrieval is complete, the final dataset can be stored in an Azure Table for further processing.

Final Thoughts

The above custom process using an Azure Function and the Office 365 Management API lets us pull the Audit log data through a custom job hosted in Azure. Once we have the data, we can filter it and build reports from it.

Retrieve Office 365 Audit logs using PowerShell and store in Azure table for quick retrieval

To create custom reports for Office 365 events, we can use the Audit logs from the Security and Compliance Center. The process is quite simple and can be implemented easily using PowerShell. In this blog, we will look at the steps involved.

Later we will also see how to store this data in an Azure Storage Table, so the data is easy to fetch later on.

Steps to fetch data from Office 365 Audit log using Exchange Online PowerShell

  1. The first step in the process is to import the commands from Exchange Online PowerShell.

    This script (shown in the sketch after these steps) initializes the remote PowerShell session for Exchange Online.

  2. After the commands are imported, we can search the audit log using the Search-UnifiedAuditLog cmdlet. The call and some helpful parameters are included in the sketch after these steps.

    More information about the available parameters is here – https://docs.microsoft.com/en-us/powershell/module/exchange/policy-and-compliance-audit/search-unifiedauditlog?view=exchange-ps

  3. After the audit log data is pulled, it can be reshaped to provide more meaningful information about the audited activity.
    For example, the RecordType and UserType values can be mapped to friendly labels rather than left as raw numbers (also shown in the sketch after these steps).
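Putting the three steps together, a minimal sketch is below. The connection approach, record type, date window and UserType mapping are illustrative and should be adapted to your tenant (newer tenants can use the Exchange Online Management module and Connect-ExchangeOnline instead of the remote session shown here):

```powershell
# 1. Initialize the Exchange Online remote PowerShell session and import its cmdlets
$credential = Get-Credential   # service account with permission to search the audit log
$session = New-PSSession -ConfigurationName Microsoft.Exchange `
    -ConnectionUri "https://outlook.office365.com/powershell-liveid/" `
    -Credential $credential -Authentication Basic -AllowRedirection
Import-PSSession $session -AllowClobber | Out-Null

# 2. Search the unified audit log for a date window
#    (SessionCommand ReturnLargeSet pages through result sets larger than 5000 rows)
$auditData = Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-1) -EndDate (Get-Date) `
    -RecordType SharePointFileOperation -ResultSize 5000 `
    -SessionId "AuditPull" -SessionCommand ReturnLargeSet

# 3. Reshape the results: expand the AuditData JSON and map numeric codes to labels
$report = $auditData | ForEach-Object {
    $detail = $_.AuditData | ConvertFrom-Json
    $userType = switch ($detail.UserType) { 0 { "Regular" } 2 { "Admin" } 4 { "System" } default { "$($detail.UserType)" } }
    [pscustomobject]@{
        CreationDate = $_.CreationDate
        RecordType   = $_.RecordType
        UserId       = $_.UserIds
        Operation    = $_.Operations
        UserType     = $userType
        ObjectId     = $detail.ObjectId
    }
}

# Export for the Azure Table import in the next section
$report | Export-Csv -Path ".\O365AuditLog.csv" -NoTypeInformation

Remove-PSSession $session
```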

Updating data into an Azure Table using Azure Storage PowerShell

After the data from the steps above is processed and ready to use, we can either export it to a CSV or store it in an Azure Table. For this blog, we will export it to CSV and then import it into an Azure Table. The benefits of Azure Table storage are as follows:

  1. Low-cost storage
  2. Easy connection and data retrieval
  3. NoSQL format allows storing information in multiple schema shapes easily
  4. Data types can be easily set and managed

Below is a sketch of the script for this step.
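It uses the Az.Storage and AzTable modules; the storage account name, key, table name and the PartitionKey/RowKey scheme are placeholders, and the CSV is the one exported in the previous section:

```powershell
# Requires: Install-Module Az.Storage, AzTable
Import-Module Az.Storage
Import-Module AzTable

# Placeholder storage account details
$ctx = New-AzStorageContext -StorageAccountName "<storageaccount>" -StorageAccountKey "<storage-key>"

# Get (or create) the target table
$table = Get-AzStorageTable -Name "O365AuditLog" -Context $ctx -ErrorAction SilentlyContinue
if (-not $table) { $table = New-AzStorageTable -Name "O365AuditLog" -Context $ctx }
$cloudTable = $table.CloudTable

# Import the CSV produced earlier and write one row per audit record
# (PartitionKey = creation date, RowKey = new GUID is just one possible scheme)
Import-Csv ".\O365AuditLog.csv" | ForEach-Object {
    Add-AzTableRow -Table $cloudTable `
        -PartitionKey ([datetime]$_.CreationDate).ToString("yyyyMMdd") `
        -RowKey ([guid]::NewGuid().ToString()) `
        -Property @{
            RecordType = $_.RecordType
            UserId     = $_.UserId
            Operation  = $_.Operation
            UserType   = $_.UserType
            ObjectId   = $_.ObjectId
        } | Out-Null
}
```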

Conclusion

In this blog, we saw how to search the Office 365 Audit log, retrieve the data and then store it in an Azure Table for later use.

PowerShell scripts for SharePoint On Prem Analysis

— Updated 14 Jun 2017 – Added more scripts below —

While recently working on a SharePoint 2010 to SharePoint Online migration project, many PowerShell scripts proved helpful for analyzing the SharePoint environment. Below is a list of them. I will add more as I build them, so they are easy to find and run. These are also helpful for analyzing your own on-prem environment.

The scripts can be found in the GitHub folder here for reference – https://github.com/AsishP/SharePointOnPrem/tree/master/PowerShell

  1. Find Large Lists
  2. Export Site Collection Features
  3. Export Web Features
  4. Check Windows vs Claims Authentication
  5. Check Language Packs
  6. Get the count of webs in a Site Collection
  7. Get list of all Web parts on sites and web
  8. Get web sites with last modified date information
  9. Get details of External Lists and BDC connections
  10. Get Empty SharePoint groups
  11. Get Broken Inheritance report
  12. Get Documents Greater than 50 MB
  13. Check Orphaned SharePoint Users (users who don’t exist in AD)
  14. Delete Orphaned SharePoint Users 
  15. Get Entries from SharePoint Audit Log
  16. Get last accessed documents from Document libraries
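To give a flavour of these scripts, below is a hypothetical sketch along the lines of the first one (Find Large Lists), run on a SharePoint server with the SharePoint snap-in loaded; the scripts in the repo above are the ones to actually use:

```powershell
# Run on a SharePoint server in an elevated SharePoint Management Shell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$threshold = 5000   # item count above which a list is considered "large"

Get-SPSite -Limit All | ForEach-Object {
    foreach ($web in $_.AllWebs) {
        foreach ($list in $web.Lists) {
            if ($list.ItemCount -gt $threshold) {
                [pscustomobject]@{
                    WebUrl    = $web.Url
                    ListTitle = $list.Title
                    ItemCount = $list.ItemCount
                }
            }
        }
        $web.Dispose()
    }
    $_.Dispose()
} | Export-Csv ".\LargeLists.csv" -NoTypeInformation
```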

Walkthrough of Site Provisioning process using PnP PowerShell

In the previous blogs here, we looked at the provisioning process for a complex Team site. Much of the complexity was easily handled by the PnP Provisioning process.

In this blog, we will look at the same provisioning process but from an admin point of view, using PnP PowerShell to create and provision the site.

Steps:
The steps are actually quite simple and could be done quickly.

  1. Build a Template Site to be used for creating the Provisioning Template
  2. Manually apply changes to the Template site and extract the Provisioning template from it using the Get-PnPProvisioningTemplate cmdlet
  3. Create the SharePoint Team site to apply the template
  4. Apply the Provisioning template to the above site using the Template obtained in Step 2
  5. Finalize the creation with any remaining changes.

Before we look at the above steps in detail, below are a few key items to keep in mind during the provisioning process. Some of these will become clearer as we go through the detailed steps.

1. During the get-template process, the key switches let you include or exclude dependency items that are not related to the new site. More details are in the Get-PnPProvisioningTemplate section below.

2. During the apply-template process, only the differential changes are applied. Hence it is safe to apply the template repeatedly, for example if an error occurs during the process.

3. Since the apply-template process only adds differential items, any explicit changes, for example removing the default left-navigation links, need to be done after the apply-template process is complete.

4. Applying a template works by matching the provisioning template schema against the latest PnP schema in code. Hence, if you are working with an old template, use the Schema switch as specified below.

Get Template Process

The first step would be to install the PnP PowerShell module. Check this blog here to see how to get started with PnP PowerShell.

Next create a template site that will be used to create the new sites.

After the site is created, we can extract a template from it using the Get-PnPProvisioningTemplate cmdlet, for example as in the sketch below. Some of the key switches for creating the template follow.
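A minimal sketch of the extraction, assuming the legacy SharePointPnPPowerShellOnline module; the site URL, output path and chosen switches are placeholders:

```powershell
# Connect to the template site
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/TemplateSite" -UseWebLogin

# Extract the provisioning template, excluding hub-syndicated content types
# and including the search configuration and multilingual resources
Get-PnPProvisioningTemplate -Out ".\TeamSiteTemplate.xml" `
    -ExcludeContentTypesFromSyndication `
    -IncludeSearchConfiguration `
    -PersistMultiLanguageResources
```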

Helpful Switches for Get-PnPProvisioningTemplate cmdlet

-ExcludeHandlers – Custom handlers to exclude settings or elements from the site such as ApplicationLifecycleManagement, Site Security, WebApiPermissions etc.
-ExcludeContentTypesFromSyndication – Exclude the Content Types pushed from the Hub. This is generally helpful if content types no longer persist and could cause conflict issues.
-ExtensibilityHandlers – Allows specifying extensibility handlers to execute while extracting a template, for custom actions such as exporting page schemas into the PnP export
-Handlers – Custom handlers to explicitly include settings or elements from the site such as Audit settings, Features, Fields, Search Settings, Term Groups etc.
-Schema – PnP Schema version of the exported template
-IncludeSearchConfiguration – Include the Search Configuration of the site
-PersistMultiLanguageResources – Persist Multilingual files

Apply Provisioning Template Process

After the template is extracted, we can apply it to any newly created site. Before applying the template, we will create a new modern site using the PnP PowerShell command below.
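A minimal sketch of the site creation; the tenant URL, title and alias are placeholders:

```powershell
# Connect to the tenant and create a new modern (group-connected) team site
Connect-PnPOnline -Url "https://contoso.sharepoint.com" -UseWebLogin
New-PnPSite -Type TeamSite -Title "Project Falcon" -Alias "ProjectFalcon" -Description "New project team site"
```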

Next is to apply the template, which takes about 20-25 minutes for a complex template with around 20 libraries, 30 site columns and 15 content types. Since the process takes that long, it is worth enabling trace logging and adding error handling around the apply step, as in the sketch below.
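A minimal sketch of the apply step with trace logging and basic error handling; the site URL, template path and switches are placeholders:

```powershell
# Turn on PnP trace logging so a long-running apply can be diagnosed afterwards
Set-PnPTraceLog -On -LogFile ".\ApplyTemplate.log" -Level Debug

# Connect to the newly created site
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/ProjectFalcon" -UseWebLogin

try {
    Apply-PnPProvisioningTemplate -Path ".\TeamSiteTemplate.xml" `
        -ClearNavigation `
        -IgnoreDuplicateDataRowErrors
}
catch {
    Write-Error "Template application failed: $_"
}
finally {
    Set-PnPTraceLog -Off
}
```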

Details of the Apply-PnPProvisioningTemplate cmdlet are below. It is important to note some of its switches, as they control how custom settings from the template are applied.

Helpful Switches for Apply-PnPProvisioningTemplate cmdlet

-ClearNavigation – Clears the site navigation before the template is applied
-IgnoreDuplicateDataRowErrors – Prevents the script from stopping on duplicate data row errors
-ExcludeHandlers – Exclude the handlers when applying template
-ExtensibilityHandlers – Apply the extensibility handlers for the custom scripts

Conclusion

In this blog we saw how to use PnP PowerShell and a PnP provisioning template to apply a custom template to a newly created site, automating the site creation process.