Thursday, November 21st, 2019

11:00 AM PT

Building Robust and Scalable Data Pipelines for Snowflake


Why should you attend?

Loading data from disparate sources (structured and unstructured data, cloud service APIs, on-prem databases, streaming sources, or IoT devices) is a critical step in building your data warehouse.

Delivering high-quality data requires designing robust data pipelines that are resilient to errors, adapt to schema changes, provide the flexibility to transform data and nested structures, auto-scale to meet peak loads, and offer complete transparency into status and health.

Join us in this Product Hour to learn how you can use Workato to set up data pipelines that manage initial loads and change data capture, handle exceptions, and recover from failures with no data loss.

What will you learn?

  • Design data pipelines for initial load/migration
  • Set up data pipelines for change data capture (CDC)
  • Use templates for jobs, error handling, and data quality (DQ) checks
  • Automate recovery and reruns
  • Reduce operational footprint with automation


Kyle Tan

Product Marketing

Register for the Webinar now!

Can't make it? Sign up anyway!
We'll send you the recording.


Jayesh Shah

VP of Customer Success