For the first challenge, we are building an AI model which will detect if you are wearing a facemask. We will implement the AI model in an app.
The Power Platform enables everyone to streamline whatever process they are dealing with. Ideally, you want to automate the entire process. With Power Automate and Power Apps you can automate a huge amount, but some processes require cognitive skills. Traditionally that meant human input, but nowadays we can use the power of AI. The Power Platform comes with AI Builder, which lets you infuse AI into your solution. Simply put, it is the power of Azure Cognitive Services in a low-code form.
AI Builder comes with some pre-built AI models that you can use directly, but there is no pre-built model that checks whether you are wearing a face mask. If you are interested in those, Microsoft has created some AI Builder hands-on labs that you can check out. For this challenge, however, we need to build a model ourselves.
Developing an AI model means training and testing it. This used to be a time-consuming task that required considerable developer skills. But in low-code heaven, it is as easy as creating a TikTok video (I think…). There is a separate app available called Lobe that helps you build the model. Once you have created the model, you can export it to your Power Platform environment. That is exactly what we are going to do. Follow the steps below to build the AI model.
Download Lobe from their website.
Open the app once it's downloaded.
Create New Project.
Name the project Face Mask.
Name the label No Mask.
Take some selfies (5 minimum) while not wearing a face mask. Pro tip: keep the button pressed to shoot a burst and move your head around. The more images from multiple angles, the better the prediction will be.
Select the Label and type Mask. A new label will be made.
Grab a face mask and put it on.
Again, take some selfies. This time you can skip the duckface.
Notice that while adding pictures Lobe starts training the model. Once it’s done training you have created an AI model.
It wasn’t that hard, right? You could now export the model to your Power Platform environment and start using it. But before we do that, we will test whether it is doing what we want it to do. In Lobe, select Use and watch the live prediction from your model. A cool feature in Lobe is that you can adjust your model while testing it: if you see a wrong prediction, you can correct it directly, and Lobe will instantly retrain the model with your new input. Making some adjustments is exactly what we are going to do.
Put on your mask. Make sure it is not covering your nose.
See the prediction from your model. It will probably think you are wearing a face mask, since the model has only seen fully worn masks so far. Click on the prediction and select the label No Mask.
Repeat this a few times and let Lobe train the model again.
Now cover your nose again with the face mask. If there is a wrong prediction, correct it.
Switch between covering and not covering your nose and make changes until the model is good enough.
That’s it! You have just created an AI model that not only detects if someone is wearing a face mask, but also predicts if it is worn properly. Now, let’s make sure we can use this model in an app.
On the Use tab in Lobe, select Export.
From all the options, select Power Platform.
Sign in with your Microsoft Developer Program account.
Name your model Face Mask and select the default environment.
Now you can export your model. Optimizing the model first is recommended and takes less than a minute.
Once the model is exported, you can view your model.
AI Builder is a premium feature. Start a trial for AI Builder.
You are now ready to use your model in a flow or app. Feel free to build whatever you want with it. If you have a cool suggestion for how this model can be used, or you were inspired to build another model for a specific scenario, please share it in the comments. For those who want to know how to build an AI Builder-enabled app, follow the steps below.
Create a new Canvas app
On the Data tab, click Add data and search for Face Mask in the AI Builder section
Add a camera control
Set the camera's StreamRate property to 100 (the interval, in milliseconds, at which the Stream image refreshes)
Add a Button
Put the snippet below in its OnSelect property:
Set(varFaceMaskOutput, 'Face Mask'.Predict(Camera1.Stream).Prediction)
Add a label with varFaceMaskOutput as its Text property
Select (press) the Button to run a prediction
Your app will now capture an image and send it to the server to be analyzed. The label will show you the prediction. You can obviously build functionality in your app that depends on the prediction. The technology is there; it is up to your creativity to implement it usefully.
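As a minimal Power Fx sketch of that idea (assuming the Mask and No Mask label names from the Lobe project, the varFaceMaskOutput variable set by the Button above, and a hypothetical feedback label control you add yourself):

```powerfx
// Text property of the feedback label (hypothetical control)
If(
    varFaceMaskOutput = "Mask",
    "Thanks for wearing your mask!",
    "Please cover your nose and mouth with a face mask."
)

// Fill property of the same label: green when a mask is detected, red otherwise
If(varFaceMaskOutput = "Mask", Color.Green, Color.Red)
```

You could use the same pattern to show or hide a screen, send a notification with Power Automate, or log results to a data source.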
You can download my example solution.