Workflow Automation Research

Methods used: Heuristic evaluation, usability testing.

Project Overview


In November 2016, I was tasked with creating the UI for an in-house workflow automation system. Since I had never worked with workflow automation software before, I wanted to determine which aspects of existing workflow automation services users liked, disliked, or felt could be improved. To do this, I conducted a heuristic evaluation of four services and developed a series of tasks for potential users to complete, in order to gauge how usable existing workflow automation systems were. For the testing, I recruited five internal users of varying technical skill levels who I felt were good representatives of our customer base.

Research & Development


I started by selecting the workflow automation applications I wanted to analyze, and settled on four: Microsoft Flow, Zapier, Scratch, and Built.io. I chose Microsoft Flow and Zapier because they present the workflow as a linear sequence of steps, and Scratch and Built.io because they let users build workflows by dragging and dropping predefined items. As part of my heuristic evaluation, I created sample workflows of varying complexity in each system. Once the evaluation was complete, I revisited the workflows I had created, sorted them into two categories (simple and complicated), and chose one task from each category for users to complete, as sketched below.
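
To make the two categories concrete, here is a minimal, hypothetical sketch in plain Python. The tools under test are visual, so nothing below is real product code; every function and field name is invented for illustration. The simple task is a linear chain of steps, while the complicated task branches on a condition, mirroring the conditional statements used in the second task.

```python
# Hypothetical illustration only: the tools under test are visual, so
# this plain-Python sketch just mirrors the *structure* of the two task
# categories. All names here are invented for the example.
from dataclasses import dataclass


@dataclass
class Message:
    subject: str
    body: str


# Stub actions standing in for the predefined blocks the tools provide.
def upload_file(name: str) -> str:
    print(f"uploading {name}")
    return f"https://files.example.com/{name}"


def post_update(text: str) -> None:
    print(f"posting: {text}")


def notify_team(msg: Message) -> None:
    print(f"notifying team about: {msg.subject}")


def archive(msg: Message) -> None:
    print(f"archiving: {msg.subject}")


def simple_task(filename: str) -> None:
    """Simple category: a linear chain where each step feeds the next."""
    url = upload_file(filename)
    post_update(f"New file available: {url}")


def complicated_task(msg: Message) -> None:
    """Complicated category: a conditional statement picks the path."""
    if "urgent" in msg.subject.lower():
        notify_team(msg)
    else:
        archive(msg)


simple_task("report.pdf")
complicated_task(Message(subject="Urgent: server down", body="..."))
```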

Execution & Results


The users' responses to these tasks were mostly positive. Users enjoyed being exposed to the different types of workflow systems, and they appreciated being consulted on a project we would be working on in the future. They also noted that having a simple task with fairly linear instructions and a more complicated task with conditional statements was representative of real-life use among our customers. Feedback on the first system (Microsoft Flow) was quite positive, with User 2 indicating that dynamic content was easy to understand, and User 3 noting that its visual similarity to other Microsoft products made it easy to pick up and use. However, other users noted that while the system was easy to use, its terminology made some of the options unclear.

Feedback on the second system (MIT Scratch) was almost as positive as the feedback on Microsoft Flow. Multiple users liked how Scratch uses colors & shapes to signal the purpose of individual widgets, and User 1 noted that "having an ability to test your workflow in real time is great". However, more than half of the users found the terminology very confusing, with User 3 noting that "this isn’t a tool for people that are unfamiliar with code, but I would use this to teach my kids how to code".

The third system tested (Zapier) is a linear workflow creation system much like Microsoft Flow, and reactions to it were very similar. User 3 thought that Zapier had very similar usage patterns to Flow and found it "super intuitive" to use. Users 1 & 2 liked how the system indicated the status of each major step and its associated substeps. However, User 4 was not a fan of Zapier, noting that it seemed "pretty complicated" and that there was an "almost overwhelming abundance of options". The last system tested was Built.io.

Built.io takes a more freeform approach to workflow assembly, letting users drag blocks onto a canvas and then draw connecting lines between the blocks to set up the flow. Each block represents a different activity, such as uploading a file to Dropbox or posting on Facebook. This was the only service that drew more negative responses than positive ones. One user failed to complete either task, giving up halfway through the simple one. Users 1 & 3 called out the terminology as very confusing, and Users 2 & 4 noted that the system was initially complex. User 3 could not figure out how to apply filters and was unable to complete the more complicated exercise.
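
For readers unfamiliar with this style of tool, the freeform model can be thought of as a small directed graph: blocks are nodes and the user-drawn connections are edges. The sketch below illustrates that idea only; it is not Built.io's actual API, and all names in it are invented.

```python
# Hypothetical sketch of the freeform canvas model: a workflow is a set
# of activity blocks plus the connections the user draws between them,
# i.e. a small directed graph. Not Built.io's actual API; names invented.
from dataclasses import dataclass, field


@dataclass(frozen=True)
class Block:
    name: str  # e.g. "Upload file to Dropbox", "Post on Facebook"


@dataclass
class Canvas:
    blocks: list = field(default_factory=list)
    edges: list = field(default_factory=list)  # (source, target) pairs

    def add_block(self, name: str) -> Block:
        """Dragging a block out of the palette onto the canvas."""
        block = Block(name)
        self.blocks.append(block)
        return block

    def connect(self, src: Block, dst: Block) -> None:
        """Dragging a connecting line between two blocks sets the flow."""
        self.edges.append((src, dst))


canvas = Canvas()
upload = canvas.add_block("Upload file to Dropbox")
post = canvas.add_block("Post on Facebook")
canvas.connect(upload, post)
print([(s.name, d.name) for s, d in canvas.edges])
```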

Overall, users preferred the systems that had a more linear flow and more plainly worded language; in this case, that means Zapier and Microsoft Flow were the winners. Users also liked systems that had well-defined methods of applying filters and that let them test individual steps.


© 2019 Jonathan De Heus