My current client is preparing to bring the Power Platform to all 9000 employees. The initial focus will be on Power Apps and Power Automate, and I expect Power Automate to be broadly adopted first.
That made me think about the Power Automate tips and tricks I've learned over the past couple of years that might help others during their first steps, and maybe even a bit further. That's why this month's challenge is a list of helpful features, tips, and tools for everyone working with Power Automate. I will list them from easy to more advanced.
🎯 Jumpstart Power Automate users
🎯 Improve your flow runtime
🎯 Learn some advanced features
Dynamic content is, in my opinion, the single most valuable feature of Power Automate, especially for early adopters. The fact that we can use output from earlier actions in all following actions is basically the core of Power Automate. The dynamic content pane shows each item with a small description to help the developer quickly identify which property to pick.
When you are starting out, this all works fine. But eventually you will create more complex flows, and you might have a property with the same name coming from multiple earlier actions (e.g. an id), or even the same action twice (e.g. list items from multiple SharePoint lists). In that case, the action name is the only indicator of which property it actually is.
If you don't rename your actions, I can guarantee you will select the wrong ID more often than you'd like. Online you can find a lot of suggestions on how to improve this. Personally, I adopted the naming convention suggested by Hilla Mäntyomena.
`Original action name | Additional information`
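To make the convention concrete, here is a hypothetical set of renamed actions for a flow that reads from two SharePoint lists (the list names are made up for illustration):

```
Get items | Books
Get items | Users
Apply to each | Book
```

Each name still starts with the original action name, so you can always tell at a glance what type of action it is, while the part after the pipe tells you what it operates on.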
What I like about this convention is that it enables you to quickly adjust a name by just adding something to the original. For the example action Get items, I normally just add the list name after it. That way I can quickly identify which items we are dealing with. Keeping the original name in the action will also help you tremendously in understanding your flow when you return to edit mode after some time.
The image below shows the same action names, but now in a more understandable way.
Automated flows are queued up. Microsoft needs to keep the service up and running at all times. That's why you might notice that, as your average runtime increases, your automated flow takes some time before it starts. So it's in everyone's favor to improve your runtime. Below I give some options on how to do this.
Power Automate runs from top to bottom, in sequence. That means an action will only start once the previous step has finished. This makes sense: an action obviously must finish before we can use its output as dynamic content. But not all actions have to wait for each other. Remember that we used the Get items action for both the Books and the Users in the rename actions tip? We don't need all the users before we can collect the books. These actions can run in parallel, which halves the runtime for this section.
To create a parallel branch, hover over the arrow above an action and press the Add a parallel branch button. If you already have a parallel branch, you might struggle to find this option in the interface: you have to hover above the branching point to get it (see image). You can extend the branches by pressing the + below a branch action. In those following actions, you obviously cannot use dynamic content from another branch, and it won't be shown either. If you want an action to wait for all the branches to finish, you need to press + New step.
The interface is not ideal for creating these parallel branches, but it will help you when you are reconfiguring your flow. After creating a branch, you can drag an action into that branch. If it is allowed, it will just work. If you drag an action to a different branch while a following action depends on its output, Power Automate will throw the following error.
Very helpful, if you ask me. Try reconfiguring a Power Automate flow yourself to see if you can improve the runtime with parallel branching.
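Under the hood, parallelism is expressed in the flow's workflow definition through the runAfter property: two actions that both run after the same predecessor form parallel branches, and an action that runs after both of them is the join point. A simplified, hand-written sketch (the action names are illustrative, not from a real flow):

```json
{
  "Get_items_Books": {
    "type": "ApiConnection",
    "runAfter": { "Previous_action": [ "Succeeded" ] }
  },
  "Get_items_Users": {
    "type": "ApiConnection",
    "runAfter": { "Previous_action": [ "Succeeded" ] }
  },
  "Combine_results": {
    "type": "Compose",
    "runAfter": {
      "Get_items_Books": [ "Succeeded" ],
      "Get_items_Users": [ "Succeeded" ]
    }
  }
}
```

Because both Get items actions only depend on the same earlier step, the service is free to run them at the same time.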
Running actions in parallel will improve your runtime, but the biggest time consumers are Apply to each actions, as these run in sequence by default. To improve this, click the ellipsis (...) on the Apply to each action and select Settings. There you can switch on the Concurrency Control toggle. By default it is set to 20, but it can be raised to a maximum of 50. This means that instead of handling each row of your input one by one, the flow will now handle 20, 50, or whatever number you set at once. This is obviously quite a significant runtime improvement.
But just as with branching, you cannot simply turn this on for every Apply to each action. If there is some form of dependency between the iterations, this might not be an option. However, in my experience, many Apply to each actions are just some form of bulk edit call, and those benefit greatly from this concurrency option.
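For reference, the concurrency toggle ends up in the flow definition as a runtimeConfiguration on the Apply to each (Foreach) action. A trimmed-down sketch, with an illustrative action name and the inner actions omitted:

```json
{
  "Apply_to_each": {
    "type": "Foreach",
    "foreach": "@outputs('Get_items_Books')?['body/value']",
    "runtimeConfiguration": {
      "concurrency": { "repetitions": 50 }
    },
    "actions": { }
  }
}
```

With repetitions set to 50, up to 50 iterations of the loop may run simultaneously instead of one at a time.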
Many people starting with Power Automate, myself included, find it relatively hard to write a proper query that directly returns exactly the records they want. I think this is because the filtering is expected to be in OData syntax. Many bypass this by just asking for all the records and, in a subsequent step, using a Filter array action to reduce them to what is actually required. There are some features and tools that ease the filtering. I have two things to share.
The default filtering is, as I've mentioned, just OData filter syntax. However, if you enable the Experimental Features, you will get some additional functionality in the GUI. You can enable this by pressing the gear icon in the top-right, selecting View all Power Automate settings, and turning on Experimental Features. After reloading, you will see a more user-friendly filtering mechanism.
This filter builder allows you to apply multiple filters to the dataset. With many filters this can still become complex, but in most cases you only need one or two. A good way to get started with filter queries.
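For comparison, this is roughly what a hand-written OData filter for the SharePoint Get items Filter Query field looks like (the column names here are illustrative; SharePoint expects internal column names and single quotes around text values):

```
Status eq 'Active' and Modified ge '2024-01-01'
```

The filter builder generates this same kind of expression for you, which is exactly why it lowers the barrier for people who don't know the OData syntax yet.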
Based on this article, it shouldn't be too long before we can just enjoy this functionality without the need of enabling Experimental Features.
The filter designer is only available for SharePoint. If you need to query Dataverse, there is a really easy tool available to do exactly this. It's called FetchXML Builder, which is part of the XrmToolBox. This is a big set of tools developed by the community for Dataverse. Because Dataverse is the database behind Dynamics, it has been around for quite some time, which is why you can find all sorts of small functional tools here. This particular one is developed, and kept up to date, by Jonas Rapp.
I recommend downloading it and unzipping it to your OneDrive, as it is just an .exe file. Once opened, open the Tool Library (1 in the image below) and search for FetchXML Builder. Just press install, and it will be available in a few seconds.
Before you can use it, you need a working connection. This is similar to an auth profile in the Visual Studio Code extension we have used a few times now. You can create a new connection by clicking Connect (2) > New connection (3) and selecting a connection method. If you are using your personal credentials, I find the Connection Wizard (4) the best option.
The environment URL can be found on aka.ms/ppac by navigating to the environments, selecting the environment of choice, right-clicking the environment URL, and copying it. Note that this will include a trailing slash (/), which can cause trouble when connecting; just remove it. The rest of the wizard is straightforward. I like to name the connection after the display name of the environment. This enables me to quickly switch between environments, and even different tenants, if needed.
Now you can start using FetchXML Builder. You will directly see the Query Builder.
The top 50 is a default setting, probably to limit the returned output for bigger datasets. For this exercise, you can just remove it in the bottom section. You will also see an exclamation mark at the entity level. If you press it, the Quick Actions pane will change, and you will get a dropdown with all the Dataverse tables listed. You can select one you created earlier, or you can select the aaduser table; there should be at least one user in there.
The nice thing about using FetchXML Builder is that you can instantly execute your query to see the returned response, which can really improve your development speed. We do all this just for the filter query option, so let's proceed. If you right-click the entity/table you just selected, you will see you can add a filter. You can select the attribute (column), the operator, and the value. After you have created a filter query, you can run it again.
Once the output is how you want it, it's time to migrate your filter query to Power Automate. Press the Convert button and select Power Automate Parameters.
As you can see in the image below, the filter query is right there for you. Just click the link and it will be copied to your clipboard.
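As an illustration of what the tool builds for you, a minimal FetchXML query with a single filter could look like this (using the standard account table as an example, not the tables from this walkthrough):

```xml
<fetch top="50">
  <entity name="account">
    <attribute name="name" />
    <filter>
      <condition attribute="statecode" operator="eq" value="0" />
    </filter>
  </entity>
</fetch>
```

The Power Automate Parameters conversion then gives you the OData equivalent of the filter part, in this case something like `statecode eq 0`, ready to paste into the filter field of your Dataverse action.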
Try, Catch, Finally
Power Automate has some out-of-the-box error handling. This is a good starting point, but in some cases you want more advanced error handling. A common pattern for this is try, catch, finally. I will only discuss it briefly, as many others have posted about this topic. There is even a template available.
The basic idea of this error handling is that you section your Power Automate flow into three pieces. In the try section, you put the actions that you want to be executed. In the catch section, you put the actions that you want executed if an error occurred. That's why, in all the posts about it, you will see that the catch section's run after setting is adjusted to run after the try section has failed. In the finally section, you put all the actions that are executed regardless of a failed or successful run.
For many cloud flows, I just stick to the out-of-the-box functionality. For flows triggered from Power Apps, however, I do like to implement this pattern. The reason I think that's a good use-case is that you want the app users to be notified of the run status of the triggered flow. For example, your flow might collect all sorts of data, populate a document with that data, and send it to a person. If this flow is not executed, the end-user will lose confidence in using the app.
The try, catch, finally pattern is ideal for this scenario. You can initialize a variable, leave it empty in the try section, adjust it in the catch section, and use it in your response to Power Apps. Power Apps can pick up the variable and show an error message if the run was unsuccessful. You can even create multiple error messages, based on which action failed.
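In the flow definition, the pattern comes down to three Scope actions whose runAfter settings differ. A sketch, with the scope contents omitted:

```json
{
  "Try": {
    "type": "Scope",
    "actions": { }
  },
  "Catch": {
    "type": "Scope",
    "actions": { },
    "runAfter": { "Try": [ "Failed", "TimedOut" ] }
  },
  "Finally": {
    "type": "Scope",
    "actions": { },
    "runAfter": { "Catch": [ "Succeeded", "Failed", "Skipped" ] }
  }
}
```

The Catch scope only runs when the Try scope fails or times out, and the Finally scope runs whether Catch succeeded, failed, or was skipped entirely because Try succeeded.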
If you are combining Power Apps with Power Automate, I highly recommend you look into the given links above.
If you are looking for solutions for your Power Automate flows, it's good to know that the technology is quite similar to Azure Logic Apps. Changing Power Automate to Logic Apps in your Google queries might help you find the answer to your question. You can also just use the ChatGPT connection that my timeline is getting spammed with.
UPDATE: After publishing, Dennis Goedegebuure was kind enough to share a very useful additional tip: the function reference. It's a Microsoft Docs page on Logic Apps, but as mentioned, the technology is almost identical. This page will help you enormously with functions. It is grouped into sections, just like the functions in the editor, tells you what each function does, and shows some examples. A very welcome addition, thanks!
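To give you an idea of what is covered there, a few examples of expressions using functions from that reference, which work identically in Power Automate:

```
concat('Hello, ', triggerBody()?['name'])
formatDateTime(utcNow(), 'yyyy-MM-dd')
length(outputs('Get_items_Books')?['body/value'])
coalesce(variables('ErrorMessage'), 'No errors')
```

Note that the action and property names in these examples are illustrative; you would swap in the names from your own flow.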
👉🏻 Renaming helps you create flows that will be easy to manage in the long run
👉🏻 You should strive for a minimal runtime
👉🏻 Server side filtering > filtering in your flow
👉🏻 XrmToolBox is a great toolbox, also for creating queries
👉🏻 Extended error handling is great for Power Apps triggered flows