Harnessing Microsoft’s Business Application Platform

The Power Platform (now dubbed the Business Application Platform) started life as a collection of three products introduced into the Office 365 portfolio: [Power]Apps for lightweight business applications, [Power]Automate for business process automation, and [Power]BI for reporting and insights. It now has a fourth constituent, [Power]Virtual Agents: a ‘low code’ solution for developing bots (for use in your frontline collaboration solutions, like Microsoft Teams).

The platform ships with a framework for managing data and information that is shared with Microsoft’s Dynamics 365 service: the Common Data Service and the Common Data Model. This is where you capture and interact with your data model if you’re not building solutions with high synergy with SharePoint Online.

The Business Application Platform is a hot property right now, and organisations are looking for opportunities to evaluate and pilot its capabilities. I’ve seen a surge in requests for partner support to deliver business solutions powered by the platform.

So why, on a case-by-case evaluation, do I keep concluding that in some scenarios the Business Application Platform is not the right solution?

OK. Put the pitchforks down, and hear me out. I’m not a blind evangelist. I think the platform is great but that doesn’t mean it’s right for every scenario.

In this article, I’ll be examining what’s required to make the Business Application Platform a viable option for your organisation, and evaluating it against other comparative enabling technologies.

As a Service

The clue is in the name: Business Application Platform. It’s a platform capability. Is it a good idea to develop solutions for a platform that has not been properly embedded within your organisation?

I’ve seen organisations take the following approaches:

  • They ban/block usage of the Business Application Platform due to security concerns, predominantly around access to and usage of business data. (I realise this is less about the platform, and more a concern that existing security vulnerabilities might be exposed or exploited).
  • They enable the Business Application Platform, but restrict usage to a qualified group. This is a temporary measure, mitigating concerns around who gets to deliver solutions on the platform and, more importantly, who supports those solutions.
  • They launch the Business Application Platform, perhaps with injected Change Management. Solutions start appearing and there is an implicit expectation of IT support; IT gets nervous that they’re not fully across what’s happening out there.

Landfall: The legacy of Excel

The concern over who owns and supports what is nothing new. It was happening 20 years ago with Excel. Consider this scenario:

  • Excel is used in the Accounting team for bookkeeping
  • Alex, from Accounting, takes a course in Visual Basic for Applications.
  • They decide to play with Excel and modify the bookkeeping workbook to automate some things.
  • It’s super effective! The rest of the team think it’s awesome. Alex makes more modifications and starts teaching the rest of the group.
  • Fast forward 6 months: what was a basic workbook is now a fully fledged business (critical) application.
  • Morgan, the head of Accounting, is getting nervous. What if Alex moves on? What if there’s a problem with the solution and the team can’t fix it?
  • Morgan approaches Jules in IT support, with an expectation that they can support the solution and have the skills to do it…

The key word here is expectation. And it’s established as part of a service governance plan:

Rule Number 1: Set expectations with the consumers of your service so they understand roles, responsibilities and accountability. Do this before you deploy the Business Application Platform.

This brings me to landfall. It’s the term I use to describe the process of transitioning technical ownership of a solution from a citizen developer (or business unit) to a formal IT support function. The Business Application Platform is targeted at everyone, including business users, and trust me, you want to put it into their hands because you’re giving them tools to solve problems and be productive. In short: you need to define and communicate a process that transitions a solution from the business into IT support as part of your governance plan.

Rule Number 2: Define and communicate a process for landfall in your governance plan

You can design a foundation for the Business Application Platform that meets your requirements for delegation of administration, in anticipation of transfers of ownership. For example: the creation of an additional (logical) environment for IT-owned and managed solutions that sits alongside the default sandbox environment created with your tenancy.

Evaluating Solutions for the Business Application Platform

I work with customers to review business requirements and evaluate enabling technology. Often I see solutions masquerading as requirements, driven by funding incentives, a need to innovate and adopt new technology, or the desire to generate a case study. I get it.

There are some gotchas: considerations beyond whether the technology can deliver a solution. There are comparative enabling technologies and key differentiators, even within the Microsoft stack. For example:

Alternatives to PowerApps for presentation and data collection include Microsoft Forms, the SharePoint Framework, Single Page Applications or fully fledged web applications.

Alternatives for Power Automate include Azure Logic Apps (it shares the same foundation as Power Automate) and Azure Functions. You’ve also got commercial off-the-shelf workflow automation platforms such as Nintex and K2. Consider ongoing use of SharePoint 2010 or 2013 workflow in SharePoint Online a burning platform.

Power Virtual Agents are an alternative to going bespoke with the Microsoft Bot Framework.

Rule Number 3: Evaluate requirements against comparative technologies with an understanding of key differentiators, dependencies and constraints.

So what are some of the key considerations?

Cost

The licensing model for the Business Application Platform is multi-tiered, so your license determines what’s in your toolbox. It might restrict use of specific connectors to line of business applications, the ability to make a simple HTTP request to a web service, or how a dashboard might be published and shared. Don’t commit to a Business Application Platform solution only to be stung with P2 licensing costs down the line.

Size and Complexity

Business Application Platform solutions are supposed to be lightweight. Like doing the COVID-19 check-in via Teams. Just look at the way you port solutions between environments. Look at the way they are built. Look at the way the platform is accessible from Office 365 applications and services. Large and complex solutions built for the Business Application Platform are arguably as hard to support and maintain as their ‘as code’ counterparts.

Synergy with Development Operations Cadence

Let’s assume your organisation has an established Development Operations (DevOps) capability, and there’s a process in place for building, testing, and delivering business solutions, and tracking technical change. It may, for example, advocate Continuous Integration and Delivery.

Along comes the Business Application Platform, and with it a different method to build, deploy and port solutions. It’s immediately at odds with your cadence. Good luck with the automation.

Technologies such as Logic Apps may be more suitable, given that solutions are built and deployed as code.
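
To make ‘as code’ concrete: a Logic App is ultimately a JSON workflow definition that can live in source control and move through your existing build and release pipeline. Here’s a minimal sketch (a hypothetical HTTP-triggered workflow that simply returns a 200 response):

{
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {},
    "triggers": {
        "manual": { "type": "Request", "kind": "Http" }
    },
    "actions": {
        "Response": {
            "type": "Response",
            "inputs": { "statusCode": 200 },
            "runAfter": {}
        }
    },
    "outputs": {}
}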

Synergy with Office 365

It’s not a hard constraint, but Business Application Platform solutions are a better fit if there is high synergy with Office 365 applications and services. The current experience enables business users to build solutions from within the Office 365 ecosystem, with a pre-defined context (e.g. a SharePoint Document Library).

Solutions that require integration with a broader set of systems may warrant the use of alternative enabling technologies, especially if additional plumbing is required to facilitate that connectivity. Do you break your principles around ‘low code’ solutions if there’s now a suite of Azure Functions, on which your Flow depends, running your business logic?

Ownership

Business users may have an appreciation of the solution delivery lifecycle, but they’re not developers. The Business Application Platform is designed to empower them and comes with the tools required to design, build, test and publish solutions. Your decision to use the Business Application Platform should be informed by a strategy to have business users own and maintain their solutions. If you get the foundations right in terms of security and governance, they’re not breaking the rules.

Maturity

Is the Business Application Platform an established service in your enterprise? If you’re leaning on partner support to crank out those solutions for you, are you ready to support and maintain them?

Low Code?

I see the term ‘no code’ or ‘low code’ everywhere. You don’t need developers! Anyone can do it! It’s cheaper.

Here’s a fact: it’s possible to build a monstrosity of a ‘low code’ solution, and it’s possible to take a lot of time to do it. Try building a complex UI in PowerApps. Go on, try.

I prefer the term no-hassle instead. The Business Application Platform is ready to use and all the technical bits are there. All you need is the license, and the skills. Keep it small and simple.

You want the ‘no hassle’ benefit to apply to service owners, administrators and consumers alike. There’s a balance here and decisions impacting one group may be to the detriment of the others.

Rule Number 4: Reach a consensus on the definition for ‘Low Code’ and what needs to be in place to realise the benefits


In summary, the Business Application Platform is a game changer, but it needs to be delivered as a platform capability. The solution that’s going to net you your case study is the first of many that will follow, but not all solutions are right for the platform. Hopefully this article provides you with some pointers around how to evaluate the platform as a potential enabling technology; it’s a culmination of what I’ve learned.

Placing a Yammer Network into Forced Retirement

People are close to their social networks, and most see the network as representative of their brand and culture. If you kill the network, you’re perceivably killing a culture.

However, sometimes this needs to happen. I recall a recent scenario: effectively an acquisition and merger, where my mandate was to roll three disparate Yammer networks into one. Requests to collaborate here, not there, and incentives weren’t cutting it. People were holding on to what they felt was their heritage.

Two of these networks had to die to ensure the third thrived.

It’s a process I lovingly call strangulation.

It’s an appropriate metaphor. Social networks thrive on interactivity. To kill the network, you have to force a reduction in that interactivity; figuratively starve it of oxygen.

How do you do that? The key is in providing a strong enough incentive (and visual cues) to move the herd.

Can you just ‘Disable’ Yammer?

There is a documented process to turn Yammer off.

In our case, we wanted to ensure ongoing access to content in the network for a period of time leading up to tenant decommissioning. We also wanted to ensure users could log into the network and ‘springboard’ into other networks in which they were ‘guests’. Outright killing the network was not an option.

Note the following:

  • You cannot manually add or remove members from the All Company group (but you can restrict new conversations to admins only).
  • You cannot prevent people accessing Yammer or creating groups whilst they remain licensed.
  • Verified network administrators cannot access private messages and groups unless Private Content Mode is enabled (consult with your legal team before you enable this mode).
  • You can’t use a tool to migrate group conversations to another Yammer network. Give users time to migrate supporting assets such as Files and Pinned Items (links) themselves.

The Importance of Change Management

This process enables ongoing visibility of content, until such time as licenses are revoked. Retirement directly impacts the network in the following ways:

  • (Non-verified) network and group administrators are demoted, becoming regular community members in the network
  • Yammer groups are removed from the search scope. Only existing members retain access.

Migration Note: If users are transitioning to another network/tenant they are likely assuming a new identity. Any connections to heritage content (things they’ve posted, liked, people they followed) are severed in the transition.

This article is not about organisational change management. It’s about what you can technically do with the Yammer network to facilitate its retirement. However, I will state that you’ll only succeed with this process if you provide clear messaging leading up to, during, and after retirement activity.

Pre-Retirement – It’s Time to Switch!
  • Explain that you’re retiring the network and the reasons why
  • Explain key terms such as ‘Deletion’ and ‘Archival’
  • If you can, publish an inventory of groups and nominated Owners (resolve scenarios where a group has no clear ownership, or there are many listed group admins).
  • Set out timelines for the retirement
  • Clearly set out what you expect Owners to do
  • Clearly set out what’s going to happen if Owners do nothing
  • Provide options (and support collateral) to help users move groups and content to other networks (if applicable)
  • Invite people to be proactive and relocate/re-establish Yammer groups (or outright delete them if they are no longer relevant).
  • Provide a support channel for the transformation.

Preparation

The Yammer Custodian

I recommend creating a Yammer Custodian generic user account to run the retirement. The account stays with the network after retirement and can be used by an admin to access content or manage settings for the network long after licenses for the user community have been revoked. The custodian has the following role:

  • It assumes a role as a verified network admin moving forward.
  • It assumes default ownership of any groups archived during the retirement
  • It is used to make announcements during retirement at network or group level (members have the option to ‘follow’ the custodian to keep informed of what’s happening).

Once you’ve created the Yammer Custodian account, head on over to your network’s Admin settings and make it a Verified Admin.

Accessing Private Content

By default, private groups remain inaccessible to Verified Admins. They have to request access like everyone else.

You have the option to set the network’s content mode to Private (Network Admin > Content Mode). This will enable the Yammer Custodian to access (and archive) groups marked as private in the Yammer network.

Consult with your legal team prior to enabling this mode. Alternatively, you can have the Yammer Custodian request permission and/or ignore private groups during the retirement.

Data Retention

By default, deleted content disappears from user view, but it will be retained in the database for 30 days.

You have the option to set the Data Retention Policy (Network Admin > Data Retention). Changing the setting to Archive will ensure anything deleted during retirement is retained for reporting & analysis.

Note: It is possible for content to be hard deleted via the GDPR workflow or via an API call, even if the data retention policy is set to Archive.

Archiving a Group

It’s a good strategy to start with the groups with the lowest member count (low risk first).

You should perform the following tasks (signed in as a verified network administrator) in order to archive a Yammer group:

  • Add the Yammer Custodian to the group and promote it to admin

Perform the following tasks as the Yammer Custodian:

  • Revoke the admin role from each other group administrator (the Yammer Custodian should be the only group admin).
  • Append the term ‘(Archived)’ to the end of the group Name
  • If the group is public, switch it to private. It will no longer be accessible to non-members, preventing anyone new from joining and/or posting, but existing members retain access.

(Removing groups from the directory and search results will prevent them from popping up in the Discover Groups Yammer feature).

  • Finally, post a message in the group to indicate it is archived:

Note: I recommend posting as an Update instead of an Announcement. An announcement generates email to all group members, and given you can archive groups quickly, you’d spam them with messages from the Yammer Custodian. Impacted users with high resistance will consider this equivalent to a ‘knife through the heart’. Announcements should be by exception.

Network Configuration

With all the Groups (with the exception of All Company) archived, it’s time to make some changes at the network level:

  • Switch to the All Company group and open Settings. Append ‘(Archived)’ to the end of the group Name and set the Posting Permissions to Restricted.
  • In Network Admin > Configuration > Basics, set the Message Prompt to some kind of deterrent. Be creative!
  • In Network Admin > Configuration > File Upload Permissions, set the File Upload Permissions to block all files.

There you have it. A process to retire (not outright kill) your Yammer network. The process documented here is mature, having been employed in support of a large scale tenant migration. It proved highly effective, but not without supporting organisational change management efforts.

Generating Document Metadata using the Power Platform (part 2)

AI Builder

(This is part two of a two-part article covering AI Builder and the Power Platform. I recommend you skim through part one for some much needed context).

In part one of this two-part series we created an AI Builder model using the Form Processing model type and trained it to extract and set the values for selected fields in documents of a specific type (in my case: a statement of work).

We now have the blueprint for a Content Type for use in SharePoint Online:


In this article (part two of the series), we’ll be creating a Flow using Power Automate. It will use our AI Builder model to extract and store metadata for statements of work uploaded to SharePoint Online.

(I’ll continue to use the statement of work as an example throughout the series. Feel free to substitute it, and its selected fields, for something more applicable in your own organisation).

Prerequisites

Before we create the Flow, we’ll need to perform the following actions (the Site Owner role will give you sufficient rights to complete this work). I don’t want to go into detail here but I’ve linked to supporting articles if you’re new to SharePoint Online.

With the prerequisite setup complete, our Document Library settings should look something like this:

With that done, we’re ready to build our Flow!

Design

Before I build a Flow using Power Automate, I like to sketch a high level design to capture any decision logic and get a better feel of what the workflow has to do. We can worry about how these steps work when we implement. Here’s what I want the Flow to do:

There are three sub-processes in the Flow. The trigger will be the creation of a document in our Document Library (manual or otherwise). If that document isn’t in pdf format, we’ll need to convert it. This is because AI Builder’s Form Processing model does not support documents in native (Microsoft) Office format. Finally, we’ll need to send the document off to AI Builder so it can be analysed.

Creating the Flow

OK, let’s get building our Flow. The solution is technically a ‘no-code’ solution but we’re going to create some expressions to handle things like token substitution. Think: Excel, not Visual Studio Code.

Creating a Flow is easy. Simply head on over to flow.microsoft.com, sign in and hit + Create. Our Flow will kick-in when someone uploads a document to SharePoint Online, so select the Automated flow template:

The first building block of an automated flow is the trigger. Select (or search for) the trigger called When an item is created or modified and hit Create.

Configuring the trigger is easy. Simply pick your Site Address from the list provided, and specify the List Name (set a Custom Value if Power Automate is having a hard time resolving your List Name, as it did frequently for me). Click + New Step when ready.

Handling Variables

Next, we need to create some variables to store values we’ll need to reference along the way. The Initialize Variable action is here to help, so we’ll create one for each variable we need.

  • Initialize a variable called FolderPath (type: string) and set its value using the following expression:
substring(triggerOutputs()?['body/{Path}'],0,add(length(triggerOutputs()?['body/{Path}']),-1))

(This removes the trailing ‘/’ from the relative path to the Document, using the List Item’s native Path property. It sucks, but we need to pass this value to a web service later in the Flow and it just breaks if you keep that trailing ‘/’ there.)
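
To illustrate with a hypothetical path, the expression simply drops the final character:

{Path} (input):      Shared Documents/Contracts/
FolderPath (result): Shared Documents/Contracts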

  • Initialize a variable called PDFUrl (type: string) and leave its Value blank for now (this will store a reference to the converted PDF we create later in the Flow).
  • Initialize a variable called PDFContent (type: object) and leave its Value blank for now (this will store the content to be passed to AI Builder for analysis later in the Flow).

At this point in the process, your Flow should look like this:

(Note: You can rename your triggers and actions to help with readability, as shown in the image above).

Decision Logic

Next, we need another action to perform a check to see if our document is in pdf format. Add another action and search for: Condition. We can use the File Name property in Dynamic content (created by our trigger) and simply check the last three characters in that value (i.e. the file extension).
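
One way to express the check is sketched below (an assumption on my part: that the trigger exposes the file name through the {FilenameWithExtension} token, with toLower normalising case so PDF and pdf both match):

endsWith(toLower(triggerOutputs()?['body/{FilenameWithExtension}']), '.pdf')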

At this point our Flow looks like this, and must split into two: the ‘yes’ branch (in cases where the document is pdf), and the ‘no’ branch (anything else). We’ll handle the ‘yes’ branch first as it’s the easiest.

The ‘Yes’ Path

We’ll need to create a reference to the content of the document and pass that to AI Builder for analysis. The good news is that if the document is already in pdf format, we can use the Get file content using path action and then assign the value from that to the PDFContent variable we created earlier in the Flow:

  • Add a Get file content using path action to the Yes branch of the Flow. Set the File Path to the Full Path property (under dynamic content) that was created by our trigger.
  • Next, add a Set Variable action. We’re going to assign the File Content property (under dynamic content) created by the previous action to the PDFContent variable we initialized earlier.

Still with me? Your setup for the ‘yes’ branch should look like this:

The ‘No’ Path

It gets a little tough here, but stick with me. If the document is not in pdf format, we must convert it.

There are a number of options here, including third party actions you can purchase to handle conversion for you. But there is a way to have SharePoint Online convert the document for you. I’ve drawn on prescriptive guidance published by Paul Culmsee (@paulculmsee) for this part of the solution. Full credit to him for figuring this stuff out!

Conversion works like this:

  • We issue a request to SharePoint Online for the document payload (including valuable meta-data)
  • We parse the resulting JSON so that the meta-data we need is cached by Flow as dynamic content
  • We assemble the URL to the converted pdf taking bits from the JSON payload.
  • We issue a HTTP request for the converted pdf and store the body of the response in our PDFContent variable.

OK, let’s go:

  • Add a Send a HTTP request to SharePoint action to the No branch of the Condition. The Uri we want to invoke is as follows:
_api/web/lists/GetbyTitle('<LIST_NAME>')/RenderListDataAsStream?FilterField1=ID&FilterValue1=<DOCUMENT_ID>

(We need to swap out <LIST_NAME> with the name of your Document Library and <DOCUMENT_ID> with the ID property of the document that was uploaded there. Fortunately the ID property is available as dynamic content.)
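
For example, with a hypothetical Document Library called Contracts and a document whose ID is 42, the Uri would read:

_api/web/lists/GetbyTitle('Contracts')/RenderListDataAsStream?FilterField1=ID&FilterValue1=42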

The parameters to this request must contain instructions to return a Uri to the pdf version of the document, so we include the following in the body of our request:

{ 
    "parameters": {
       "RenderOptions" : 4103,
       "FolderServerRelativeUrl" : "/sites/sandpit/@{variables('FolderPath')}"
    }
}

(here we need to send the FolderPath with trailing ‘/’ removed, so we’re using the FolderPath variable we set at the start of the Flow).

So your action configuration should look similar to this:

Next, we need a copy of the payload schema returned by this service call so we can reference it as dynamic content in our Flow. The easiest way to do this is to Test the Flow and copy the schema. Hit the Test button and select I’ll perform the trigger action and hit Save and Test.

Flow will wait patiently for you to upload a document to your Library to trigger the Flow. Once done, steps in your flow will be ticked off as they are tested. Examine the Send a HTTP request to SharePoint action and copy the body content to your clipboard (we’ll use this to create the next action).

  • Switch back to edit mode and add a Parse JSON action. We’re going to assign the body property (under dynamic content) created by the previous action to the Content property. To set the schema property, select Generate from sample and paste the body content you copied to your clipboard when you ran the test.

Your action configuration should look similar to this:

We need to parse the output of the HTTP request because we’re using key values in the payload to create the Url to the converted PDF in our next step.

  • Add a Set Variable action. We’re going to use the following expression to assemble our Url:
@{body('Parse_JSON')?['ListSchema']?['.mediaBaseUrl']}/transform/pdf?provider=spo&inputFormat=@{first(body('Parse_JSON')?['ListData']?['Row'])?['File_x0020_Type']}&cs=@{body('Parse_JSON')?['ListSchema']?['.callerStack']}&docid=@{first(body('Parse_JSON')?['ListData']?['Row'])?['.spItemUrl']}&@{body('Parse_JSON')?['ListSchema']?['.driveAccessToken']}

Your action should look like this:

  • The .mediaBaseUrl property contains the URL to the platform’s media content delivery service (e.g. australiaeast1-mediap.svc.ms)
  • The first expression parses the file type, to pass the format to convert from (e.g. docx)
  • The second expression parses the uri to the pre-converted document in SharePoint Online.
  • The .callerStack and .driveAccessToken properties are encoded strings (tokens) providing useful session context.

(If you want, Test the Flow at this stage. Copy the output of the Set PDF Url action into your browser. If it’s correct, it will display a PDF version of the document you submitted at the start of the test).

  • Next, we need the Flow to request the converted pdf directly. Add a HTTP action to your Flow. (Astonishingly, this action requires a premium license). Assign the PDFUrl variable to the URI property.

Your action should look like this:
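
In text form, the configuration amounts to a simple GET of the URL we assembled (a sketch):

Method: GET
URI: @{variables('PDFUrl')}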

  • The last action in your ‘no’ branch will assign the body of the response generated by the HTTP action (as dynamic content) to the PDFContent variable we initialized at the start of the Flow.

Your action should look like this:

Quick Recap

At this stage, your Flow has two branches based on a test to see if the document format is pdf. Each branch ultimately sets the PDFContent variable we’ll use for the final step. In the yes branch, the content is a snapshot of the document uploaded to SharePoint Online. In the no branch, it’s a snapshot of a pdf conversion of that document. Here’s what it should look like:

Calling AI Builder

The final step of the flow occurs when the two branches converge. We’ll invoke the AI Builder model we created in part one of this article and then update the document’s List Item once we have the meta-data.

  • Search for an action called Process and save information from forms and add it to your Flow where the conditional branches converge. Select your model from the list provided. Use the PDFContent variable (as dynamic content) here.

Your action should look like this:

  • Next, we need to take the output of the analysis and use it to update the column data for the statement of work content type in SharePoint Online. Add an Update Item (SharePoint) action to the Flow. Once you specify the List Name the fields will be loaded in for you. Simply use the dynamic content panel to assign values returned from AI Builder to your columns.

Your action should look like this:

As a stretch goal, you can add some resilience to your Flow by adding conditional logic when you update your list item. AI Builder passes its confidence scores back to Flow, so you could, for example, update the list item only if the confidence score meets a specific threshold.
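
A minimal sketch of that logic: capture a field’s confidence score (exposed as dynamic content by the AI Builder action) in a variable, here a hypothetical ConfidenceClientName, and gate the Update Item action behind a Condition with an expression such as:

greater(float(variables('ConfidenceClientName')), 0.8)

If the condition evaluates to false, you might route the document to a manual review queue rather than updating the item.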

The Final Solution

Your final solution should look something similar to this:

As statement of work documents are uploaded to the Document Library, Flow will pick them up and push the content to your model in AI Builder for analysis. The result is a complete set of metadata for each document without the need for manual intervention.

Final things to note:

  • Processing is asynchronous and will take a few seconds (especially if conversion is needed), so the meta-data will not be available immediately after the document is uploaded. My Flow completed on average in about 15-25 seconds. Your mileage may vary.

Wow. That was quite a lot. Thanks for sticking with me. In this series, we trained and published a model using AI Builder and then created a Flow using Power Automate to invoke it.

The solution is able to analyse documents as they are uploaded to SharePoint Online. Column values are extracted from the document as AI Builder runs the analysis. The Flow updates the associated List Item, assigning values to our Columns.

Generating Document Metadata using the Power Platform (part 1)

AI Builder

I hate filling in forms. Really, I do. Imagine if you had to fill in a form each time you uploaded a document into SharePoint Online? You do? I feel for you.

For some time I’ve been wanting to look at how my organisation can use tools such as AI Builder to help auto-generate metadata for the types of documents we store and manage in SharePoint Online.

At the time of writing, AI Builder is about 12 months on from its preview, and it’s been aligned to the Power Platform. The solution includes a form processing AI model, which can be trained to extract named properties from your documents.

I wanted users to be able to search, sort and filter on key pieces of information contained in our statements of work (contracts). The logical solution was to create columns to store this information, given the content is hosted in SharePoint Online. From there, I could convert these columns into managed properties, for use in search refiners and as referenceable values in other solutions.

This is part one of a two-part series designed to walk you through the steps to develop a no-code solution to analyse documents and automatically extract and set associated metadata. (You can find part two here.) Our solution will use the following building blocks:

  • SharePoint Online
  • AI Builder
  • Power Automate

From a licensing perspective, you’ll need a Power Automate P2 (Premium) license to re-create my solution, and an E1 licence or better to configure the SharePoint Online components.

This article (part one) covers the AI Builder component. Part 2 covers the SharePoint Online / Power Automate component.

Create your Model

You can access AI Builder via PowerApps or Power Automate. Here you can select a pre-fabricated model and train it. The models you train and publish are stored centrally and visible to all your Power Platform solutions so it doesn’t matter how you access AI Builder.

We’re going to use the Form Processing model and train it to extract information from our statement of work documents.

Some important things to note:

  • At the time of writing, AI builder can only analyse content in pdf, png, jpg and jpeg format (don’t worry, our solution will handle conversion from native Office document formats).
  • Start with 5 examples of the document you want the solution to analyse. Ensure they contain examples of the meta-data you want to extract. This is because the field selection process targets a single document, and you don’t get to choose which one if you start with many documents.
  • For a reliable model, upload at least 5 other training documents after field selection. Manual effort is needed to teach AI Builder how to analyse each document. Put these in the same location so you can bulk upload them in one action.
  • It’s important to introduce some variety here (in my case, there were several key variations of our statement of work I wanted AI Builder to recognise). A set of completely different, unrelated documents will skew your model.

Upload

First, convert any training documents to pdf format if they are in a native Office document format.

You’ll be prompted to upload your documents. Remember you can add more training documents later if you like.

Once you’ve uploaded your documents, hit that Analyse button to give AI Builder an initial look.

Review

Now you’ll have the opportunity to identify the metadata you want to extract from similar documents. AI Builder may take the initiative and create some for you. In any case, keep what you want. You can come back after saving the model to repeat this process.

AI Builder will present an example document to you for review. Simply highlight the content you wish the model to extract from the document and create an associated field name. As mentioned earlier, field selection targets a single document, so make sure you can find all your fields in this one.

When you’re done with the document, hit that Confirm Fields button and you’re ready to review the rest of your payload.

This review is both manual and sequential, one document at a time. Your progress is tracked as you go. You’ll need to identify and assign values to each field you created.

(In cases where a field does not occur in one of your documents, or its value is not set, you can select the Field not in document option).

Note that if you add more documents to the model, you’ll need to run through this process for each new document you added. You won’t have to repeat the process for documents already reviewed unless you add, modify or remove a selected field.

At the end of the review process, you’ll be able to click the Done button. On clicking Next you’ll be presented with a model summary.

Training

The final step before testing is to Train your model. This will save it and the training will happen in the background. Keep an eye on the Status. It will read Trained when training is complete. You’re now ready to test your model before publishing it.

Testing

So here I have my model, which I have lovingly called SOW_Processing. I needed 20 documents to fully train my solution. There are 12 selected fields my model will extract during analysis. Your own end state will of course vary.

Listing the field names here is handy, as we’ll need a Content Type to represent a statement of work and Columns to hold this metadata in our SharePoint Online solution. We’ll need a blueprint for the SharePoint Online setup. Let’s go with this:

Here we’re testing the reliability of the model. It’s not fit for purpose if, for example, it can’t identify correct values for the selected fields in a statement of work presented to it.

The Quick test allows us to see how well a model can identify the correct metadata in documents we present to it for analysis. Make sure you use new documents (not used to train the model) for testing.

Upload a document and have the model analyse it. On scanning the document, you’ll find parts of it highlighted, indicating that the model has found a value for one of its selected fields.

The Confidence score is important. This represents the degree of certainty the model has that the value for the selected field is correct.

Some important things to note:

  • Training is an iterative process. You’re supposed to re-train your model over time, introducing new variations to it.
  • Set a minimum confidence threshold for each selected field as part of your acceptance criteria for the model.
  • Scores between 80 and 100 are good. You cannot guarantee a confidence score of 100 even if you train your model extensively, since new variations can always be presented to it.
  • Scores below 80 introduce risk and below 50 are indicative of an unreliable model. An unreliable model is, in my opinion, not fit for purpose (since you can’t trust the data). The solution here is to train your model using your test documents (by uploading them to the model and re-training it). This will teach your model to recognise such variations in future tests.
  • You can use the confidence score in applications (such as Flows or Apps) as part of your validation logic! (more on that in part 2).

Publication

So, let’s assume you’ve run some Quick tests, and your model is identifying values for selected fields with a confidence score that is within the threshold set in your acceptance criteria. Great! You can now publish it by selecting Publish.

Your model is ready for use.


Now that we’ve created an AI Builder model to extract selected fields from our statement of work documents, we can develop a Power Automate solution to extract and set the metadata for these documents when they get uploaded to SharePoint Online!

Head on over to part two where we’ll cover the setup and configuration of the workflow (Power Automate) component of the solution.

The Yammer Roast

Taking my inspiration from Comedy Central, the Yammer Roast is a forum in which we can directly address resistances around Yammer, its role, and past failures in retrospect.

Some of my clients have tried Yammer and concluded that, for various reasons, it’s failed to take hold. For some, the value is clear and it’s a case of putting a compelling approach and supporting rationale to sponsors and consumers who remain sceptical. Others are looking for a way to make it work in their current collaboration landscape.

The Yammer roast is designed to tease out, recognise and address key resistances. It’s not an opportunity to blindly evangelise Yammer; it’s an exercise in consulting to provide some clarification around Yammer as a business solution, and what’s needed for a successful implementation.

In this article, I’ll cover some of the popular resistances aired at Yammer Roasts, why these resistances exist and how you can address them. If you’re an advocate for social networking in your own organisation, my hope is that this can inform your own discussion.

  1. We have concerns over inappropriate usage and distraction from proper work

There’s a perception that Yammer is a form of distraction and employees will be off posting nonsense on Yammer instead of doing proper work. Even worse, they may be conducting themselves inappropriately.

A self-sustaining Yammer network has to find that balance between [non-work stuff] and [work stuff], and it needs an element of both to be successful. Informal, social contributions beget more meaningful work contributions.

Consider what is perceived as informal, non-work-stuff to be valuable. That person who just posted a cat picture? They are adopting your platform. As are those people who liked or commented on it. Consider the value to the organisation if people are connecting with each other and forming new relationships, outside of the confines of an organisational hierarchy.

Assume your employees know how to conduct themselves and can practice good netiquette. They signed a contract of employment which includes clauses pertaining to code of conduct. Perhaps refer to that in your terms of use.

Establish a core principle that no contribution should be discouraged. It really doesn’t matter where content in Yammer is generated, and any one person’s view of the content is informed by who they follow, the groups they subscribe to and the popularity of content. Uninteresting, irrelevant content is quickly hidden over time.  “But what if someone puts a cat picture in the All Company feed?” So what? What if the CEO likes it? Consider creating a foundational set of groups to ensure that on day one there’s more than just the All Company feed.

Strike that balance between work-stuff and non-work stuff.  Set an objective for your community manager (yes, a formal responsibility!) to help combat potential stage fright; there are numerous incentives and initiatives that can come into play here.  Accept the fact that your social network will, and should, grow organically.

  2. We’ve got Yammer and no-one is using it.

…but your partners and vendors are, and they’re looking to collaborate with you.

Stagnant networks: a common scenario. Your organisation may be looking at alternative platforms as a way to reset/relaunch. Here, you lament the lack of tangible, measurable business outcomes at the outset of the initial rollout, or the lack of investment in change management activities to help drive adoption of the platform.

You’ll smile and nod sagely, and perhaps talk to a view similar to the following:

But, for whatever reason, you’re here. So how can past experiences inform future activities?

Whether you use Yammer or not, the success of your social network in its infancy is dependent on measurable business outcomes. Without the right supporting campaign, a way to track adoption and a way to draw insight from usage, you effectively roll the dice with simply ‘turning it on’. Initiatives around Yammer can start small with a goal of communicating the success (of a process) and subsequently widening its application within your business.

Simply swapping out the technology without thinking about the business outcome may renew interest from sponsors who’ve lost faith in the current product, but you risk a rinse and repeat scenario.

“But we’re dependent on executive sponsorship!” I hear you lament. This is a by-product of early boilerplate change campaigns, where success somehow rested on executives jumping in to lead by example. Don’t get me wrong, it’s great when this happens. From my perspective, you need any group within your business with a value use case and the willingness to try. You have O365, the technology is there.

You can consider the Yammer client to be a portal not just into your network, but into the networks of your vendors and partners. Access to your partner/vendor product teams (via Yammer External Networks), and being able to leverage subject matter expertise from them and the wider community, is a compelling case in the absence of an internal use case.

Combatting any negative perceptions of your social network following a failure to launch is all about your messaging, and putting Yammer’s capability into a wider context, which leads me to…

  3. But we’re using Teams, Slack, Jabber, Facebook for Workplace (delete as appropriate)

Feature parity – it can be a head scratcher. “But we can collaborate in Skype. And Teams! And Yammer! And via text! What will our users do?” Enterprise architects will be advocating the current strategic platform in the absence of a differentiator, or exception. Your managed services team will be flagging additional training needs. There will be additional overheads.

If you’re there to champion Yammer in the face of an incumbent (or competing) solution, you need to adopt the tried and tested approach which is 1. Identify the differentiator and align the new technology (i.e. Yammer) to it, 2. Quantify the investment, and 3. Outline the return on investment.

As a consultant my first conversations are always focused around the role Yammer will play in your organisation’s collaboration landscape. The objective is to ensure initial messaging about Yammer will provide the required clarity and context.

This reminds me of an engagement some time ago; an organisation with a frontline workforce off the radar forming working groups in Facebook. “We aren’t across what’s going on. We need to bring them over to Yammer.” Objective noted, but consider the fact that a) these users have established their networks and their personal brand, b) they are collaborating in the knowledge that big brother isn’t watching. Therefore, there’s no way in hell they’ll simply jump ship. The solution? What can you provide that this current solution cannot? Perhaps the commitment to listen, respond and enact change.

The modern digital workplace is about choice and choice is okay. Enable your users to make that informed decision and do what is right for their working groups.

  4. It’s another app. There’s an overhead to informing and educating our business.

Of course there is. This is more around uncertainty as to the strategy for informing and educating your business. Working out the ‘what’s in it for me?’ element.

There is a cost to getting Yammer into the hands of your workforce. For example, from a technical perspective, you need to provide everyone with the mobile app (MAM scenarios included) and help users overcome initial sign-in difficulties (MFA scenarios included). Whatever this may cost in your organisation, your business case needs to justify (i.e. provide a return on) that investment.

Campaign activities to drive adoption are dependent on the formal appointment of a Community Manager (in larger organisations), and a clear understanding around moderation. So you do need to create that service description and governance plan.

I like to paint a picture representing the end state – characteristics of a mature, self-sustaining social network. In this scenario, the Yammer icon sits next to Twitter, Instagram, Facebook on the mobile device. You’re a click away from your colleagues and their antics. You get the same dopamine rush on getting an alert. It’s click bait.  God forbid, you’re actually checking Yammer during the ad-break, or just before bed time. Hang on, your employee just pointed someone in the right direction, or answered a question. Wait a second! That’s voluntarily working outside of regular hours! Without pay!

  5. Yammer? Didn’t that die out a few years ago?

You’ve got people who remember Yammer back in the days before it was a Microsoft product. Yammer was out there. You needed a separate login for Yammer. There were collaboration features built into Microsoft’s SharePoint platform but they sucked in comparison, and rather than invest in building competitive, comparative features into their own fledgling collaboration solution, Microsoft acquired Yammer instead.

Roll forward a few months, and there’s the option to swap out social/newsfeed features in SharePoint for those in Yammer, via the best possible integration at the time (which was essentially link replacement).

Today, with Office 365, there’s more integration. Yammer has supported O365 sign-in for a couple of years now. Yammer functions are popping up in other O365 workloads; a good example is the Talk about this in Yammer function in Delve, which frames the resulting Yammer conversation within the Delve UI. From an end user experience perspective there is little difference between Yammer now and the product it was pre-acquisition, but the product has undergone significant changes (external groups and networks, for example). Expect ongoing efforts to tighten integration with the rest of the O365 suite, and understand and address the implications of cutting off that functionality.

The Outcome

Yammer (or your social networking platform of choice) becomes successful when it demonstrates a high value role in driving your organisation’s collaborative and social culture. In terms of maturity we’re talking self-sustaining, beyond efforts to drive usage and lead by example.

Your social network is an outlet for everyone in your organisation. People new to your organisation will see it as a reflection of your collaborative and social culture; give them a way to connect with people and immediately contribute in their own way.

It can be challenging to create such an outlet where the traditional hierarchy is flattened, where everyone has a voice (no matter who they are and where they sit within the organisation). Allowing online personalities to develop without reluctance and other constraints (“if it’s public, it’s fair game!”) will be the catalyst to generating the relationships, knowledge, insight (and resulting developments) that will improve your business.

Your Modern Collaboration Landscape

There are many ways people collaborate within your organisation. You may or may not enjoy the fruits of that collaboration. Does your current collaboration landscape cater for the wide variety of groups that form (organically or inorganically) to build relationships and develop your business?

Moving to the cloud is a catalyst for re-evaluating your collaboration solutions and their value. Platforms like Office 365 are underpinned by search/discovery tools that can traverse and help draw insight from the output of collaboration, including conversations and connections between people and information. Modern applications open up new opportunities to build working groups that include people from outside your organisation, with whom you can freely and securely share content.

I’ve been in many discussions with customers on how enabling technologies play a role in the modern collaborative landscape. Part of this discussion is about identifying the various group archetypes and how characteristics can align or differ. I’ve developed a view that forms these groups into three ‘tiers’, as follows:

Organisations should consider a solution for each tier, because there are requirements in each tier that are distinct. The challenge for an organisation (as part of a wider Digital Workplace strategy) is to:

  • Understand how existing and prospective solutions will meet collaboration requirements in each tier, and communicate that understanding.
  • Develop a platform where information managed in each tier can be shared with other tiers.

Let’s go into the three tiers in more detail.

Tier One (Intranet)

Most organisations I work with have an established Tier One business solution, like a corporate intranet. These are the first to mature. They are logically represented as a hierarchy of containers (i.e. sites), with a mix of implicit and explicit access control (and associated auditing difficulties). The principal use is to store documents and host authored web content (such as news). Tier One systems are usually dependent on solutions in other tiers to facilitate (and retain) group conversations or discussions.

  • Working groups are hierarchical and long term, based on a need to model the relationships between groups in an organisation (e.g. Payroll sits under Finance, Auditing sits under Payroll)
  • Activity here is closed and formal. Contribution is restricted to smaller groups.
  • Information is one-way and top down. Content is authored and published for group-wide or organisation-wide consumption.
  • To get things done, users will be more dependent on a Service Desk (for example: managing access control, provisioning new containers), at the cost of agility.
  • Groups are established here to work towards a business outcome or goal (deliver a project, achieve our organisation’s objectives for 2019).

Tier Three (Social Network)

Tier Three business solutions represent your organisation’s social network. Maturity here ranges from “We launched [insert platform here] and no-one is using it” to “We’ve seen an explosion in adoption and it’s Giphy city”. They are usually dependent on solutions in other tiers to provide capabilities such as web content/document management (case in point: O365 Groups and Yammer).

  • Tier Three groups are flattened and cannot, by design, model a hierarchy. They tend to be long term, and prone to stagnation.
  • Groups represent communities, capabilities and similar interest groups, all of which are of value to your organisation. At this point you say: “I understand how the ‘Welcome to Yammer’ group is valuable, but what about the ‘Love Island Therapy’ group?”. At this point I say: “Here you have a collection of individuals who are proactively using and adopting your platform”.
  • Unlike in the other tiers, groups here tend to have no business outcome, although they’ll have objectives to gain momentum and visibility.
  • Collaboration here is open (public) and informal, down to the #topics people discuss and the language that is used.
  • A good Tier Three solution will be fully self service, subject to a pre-defined usage policy. There should be no restrictions beyond group level moderation in terms of who can contribute. If it’s public or green it’s fair game!
  • Tier Three groups have the biggest membership, and can support thousands of members.

Tier Two (Workspaces)

Tier Two comes last, because in my experience it’s the capability that is the least developed in organisations I work with and the last to mature.

A Tier Two business solution delivers a collaborative area for teams such as working groups, committees and project teams. They will provide a combination of features inherent in Tier One and Tier Three solutions. For example, the chat/discussion capabilities of a Tier Three solution and the content management capabilities of a Tier One solution.

  • Tier Two groups are flattened and cannot, by design, model a hierarchy. They tend to be short term, in place to support a timeboxed initiative or activity.
  • Groups represent working groups, committees and project teams, with a need to create content and converse. These groups are coalitions, including representation from different organisational groups that need to come together to deliver an outcome.
  • Groups work towards a business outcome, for example: develop a business case, deliver a document.
  • Collaboration here tends to be closed (restricted to a small group) and semi-formal, though such groups can sit anywhere between closed and formal, and open and informal.
  • A good Tier Two solution will be fully self service, subject to a pre-defined usage policy. There should be no restrictions beyond group level moderation in terms of who can contribute.
  • Groups represent a small number of individuals, and do not grow to the size of departmental (Tier One) groups or social (Tier Three) groups.

The three-tier view identifies the different ways collaboration happens within your organisation. It is solution agnostic: you can advocate any technology in any tier if it meets the requirement. The view helps evaluate the diverse needs of your organisation, and determine how effective your current solutions are at meeting requirements for collaboration and information working.