Warehouse-Centric Process Automation

Ethan
CEO, Portable

Process automation and analytics are converging.

This article outlines the history of process automation and summarizes why warehouse-centric process automation (also known as operational analytics) will power the future of business operations.

The history of process automation

Over time, process automation has gone through different stages:

  1. Manual work -- No tools, no written processes, just people doing the work by hand. If the person doing the work leaves, you start from scratch.
  2. The addition of SaaS tools to provide leverage -- Data is stored along the way and processes are codified; however, there is still a dependency on people to push work through the process. Someone needs to use the tool.
  3. Robotic process automation (RPA) -- Instead of having people complete the tasks, you have a script that clicks buttons and enters data. It is great because it removes the direct dependency on the person, but whenever the buttons move or the data entry changes, everything breaks and you have to fix your script. Instead of a team of people completing the process, you have a smaller, more technical team managing the scripts.
  4. API integration with automation tools -- Application programming interfaces (APIs) offer a more stable interface than scripting directly against buttons on a website. Your process can run in the background, pulling information from one tool, completing tasks, and then entering information into another tool in a stable and reliable way.

What is the downside of using automation tools for process automation?

Automation tools like iPaaS (integration platform as a service) solutions create a direct coupling between business applications and business processes, increasing switching costs and reducing agility. You can't swap out an application without having to create entirely new business processes.

As long as you never change your applications, these tools work great for process automation. The workflows are stable, simple, and fast. You can log into an iPaaS tool, create a workflow, and let it run in the background forever.

The problem is that tech stacks evolve. As companies expand, as they grow their teams, and as new solutions are introduced to the market, companies need to be able to migrate to new technologies to gain leverage and scale their business operations.

Warehouse-centric process automation allows you to define your processes entirely independently of your business applications

Cloud data warehouses are scalable, fast, and easy to use, and they rely on a widely accessible language (SQL) to define procedures. In many cases, companies have already filled a data warehouse with valuable information from their business applications and generated insights that can power process automation out of the box.

This approach is so powerful because you can define entire processes that are decoupled from the suite of applications you leverage.

Here is an example of how things work with automation tools today

Let's say you have a business process where you take the list of customers that are in the 'negotiating contract' lifecycle phase in your customer relationship management (CRM) system, and send them an email with your email service provider (ESP).

Using an automation tool, you would authenticate with your CRM system, configure the data to be extracted, set up loops to iterate through the data, map the data you need to the fields in your ESP system, configure a connection to your ESP, authenticate, and set the workflow live.

If you change your CRM system, this entire workflow breaks. If you change your ESP system, you start from scratch. Not only do you need a new connection, but the way you extract the data, the iteration logic, and the field mapping all change. Every workflow you have created is coupled with the systems that are part of it.

What does this look like with warehouse-centric process automation?

Process automation through a data warehouse can be thought of as three distinct and decoupled steps:

  1. Connect your source application, extract the data and load it into the warehouse
  2. Process data from its raw form to the output structure you need for your use case
  3. Extract the output data, package it, and ship it into your downstream application

The actual processing logic (step #2 above) can be defined in a way that is entirely agnostic to the source application and the downstream system. You can create entire business processes that can be reused and repurposed as you switch out and evolve your business applications.

For the CRM to ESP example workflow above, you can now create a process against the data, not the applications. As long as you have a data source that can provide a list of customers with a lifecycle phase (e.g. 'negotiating contract'), you can use this existing procedure to turn the data into the fields necessary to send an email.
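
To make that concrete, here is a minimal SQL sketch of what such a reusable procedure could look like. The table and column names (crm_contacts, lifecycle_stage, and so on) are hypothetical placeholders for whatever your connector lands in the warehouse; the point is that the logic references data shapes, not a specific CRM or ESP.

```sql
-- Step 2 only: the processing logic, defined against data rather than applications.
-- All table and column names below are hypothetical.
CREATE OR REPLACE VIEW outreach_negotiating_contract AS
SELECT
    email_address,   -- the field the ESP ultimately needs
    first_name,      -- personalization fields for the email template
    company_name
FROM crm_contacts
WHERE lifecycle_stage = 'negotiating contract';
```

A step 3 job can then read outreach_negotiating_contract and push the rows into whichever ESP you use; nothing in the view itself mentions either application.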

If you decide to switch your CRM system, you can connect the new tool to the same process, run it in parallel to make sure everything looks correct, and then sunset your old solution. If you acquire a company with its own CRM system, you can feed this process with data from both CRMs at the same time. If you decide to switch your email platform, or want the same upstream data to power additional messaging channels (e.g. text messages as well as email), the change is trivial.
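
As a rough sketch of why the swap is painless (again with hypothetical table names): only the staging layer that produces crm_contacts changes, while the transform above stays exactly as it is. Running two CRMs side by side is just a UNION at this layer.

```sql
-- Only the staging layer changes when a CRM is swapped or added.
-- new_crm_contacts and acquired_crm_contacts are hypothetical tables loaded by connectors.
CREATE OR REPLACE VIEW crm_contacts AS
SELECT
    email      AS email_address,    -- rename the new CRM's fields to the shape
    first_name,                     -- the downstream transform already expects
    company    AS company_name,
    deal_stage AS lifecycle_stage
FROM new_crm_contacts               -- the replacement CRM's landed data

UNION ALL

SELECT
    contact_email AS email_address,
    given_name    AS first_name,
    account_name  AS company_name,
    pipeline_step AS lifecycle_stage
FROM acquired_crm_contacts;         -- the acquired company's CRM, mapped to the same shape
```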

This decoupling provides unbelievable flexibility, agility, and visibility into your business processes, so you can automate processes while also minimizing the switching costs inherent in upgrading your enterprise tech stack.

This warehouse-based architecture offers three core benefits for enterprise agility and extensibility

  1. First, you decouple process automation from the particular applications you have in place today. Data pipelines in your data warehouse (i.e. the logic behind your process automation) can be created in a way that is agnostic to where the data comes from. This means that when you decide you want to swap out an application, you can simply connect the new tool to your warehouse. Automation tools and other point-to-point solutions couple the process logic with the applications you use, so if you swap out a single application, you will have to surgically rebuild tens, hundreds, or even thousands of workflows between that system and the rest of your enterprise applications.
  2. Second, the data that is powering your process automation becomes the same data available for analytics and dashboards. You no longer have to worry about the data in your dashboards and reports not aligning with the data in your business applications. If you have a list of customers that you have synced from your CRM tool to your ESP, you can be confident that the same customers who received emails are also listed in the dashboards and reports built on top of your warehouse. Not only does this increase the integrity of your data and reduce the risk of making decisions on inaccurate information, but it also saves you time. Instead of connecting to tools and defining important metrics twice (once for analytics and once for process automation), you only have to do things once (see the sketch after this list).
  3. Third, and most exciting, there are no limits to what you can build. Once you get your data into your data warehouse, you no longer have restrictions on the scale of data you can process, the complexity of the workflows you can build, or the intricacies of the analytical models you can create. As you grow your capabilities and your vision, the cloud data ecosystem can unlock entire new frontiers. Data warehouses have native AI/ML capabilities, marketplaces for third-party data, and sharing capabilities for second-party data.
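
To illustrate the second benefit with the running example (hypothetical names again): the dashboard and the outbound sync read the same warehouse model, so "customers negotiating a contract" is defined exactly once.

```sql
-- The same warehouse model powers both analytics and activation.

-- Analytics: a dashboard tile counting open negotiations by company.
SELECT company_name, COUNT(*) AS open_negotiations
FROM outreach_negotiating_contract
GROUP BY company_name;

-- Activation: the outbound sync reads the very same rows to update the ESP.
SELECT email_address, first_name, company_name
FROM outreach_negotiating_contract;
```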

At Portable, we have spent the last year and a half architecting our tech platform to make warehouse-centric process automation possible

We are a warehouse-centric no-code connector platform built to empower data teams with the tools necessary to define robust, reliable, and extensible workflows through their data warehouse. We handle the logic and technology that connects your business applications to your warehouse, so you can invest in building reusable processes and analytics.

Specifically, we have architected our platform and our company around two objectives:

  1. Create a development platform so efficient that we can build connectors to business applications on demand for clients. Not only can we build no-code connectors to business applications fast, but we can also maintain them at scale (both in the volume of data and the number of connectors). We previously led business intelligence and the integrations engineering division at LiveRamp (NYSE: RAMP), so not only have we been in the shoes of our users, but we have also maintained petabyte-scale integrations powering workflows for the Fortune 500.
  2. We are the first platform architected to power both inbound and outbound connectors between your business applications and your data warehouse. Our solution delivers data into your warehouse, and also pulls data from your warehouse to sync it back into your business applications. Instead of trying to convince one vendor to add the data source you need and another vendor to add the destination, we can power both to help automate specific workflows for your team.

We love talking about new trends, technologies, and best-in-class data architecture. If you want to learn more, don't hesitate to reach out

We know the best practices, the cutting-edge technologies, the consultants, and the limitations of the cloud data ecosystem. We also speak the language of process automation and business operations. If you have open questions or ideas, we are more than happy to set up time to brainstorm together. If you want a demo of our platform, or are interested in working with us, even better!

Excited to hear from you at [email protected].

  • If you represent a technology platform interested in becoming a data source, or a destination, feel free to reach out as well. All we need is your documentation and a sandbox environment, and we will take care of the connector development and maintenance.