Data Engineer

Job Location: US Remote
Job Tag: Data Analytics


About Knotch

As the global leader in Content Intelligence, Knotch’s mission is to empower brands to unlock the true value of their content through data-driven strategies. With the Knotch Content Intelligence Platform, companies conduct competitive research and measure the performance of their content in real time. Through our unique ability to provide a 360-degree view of all their content, both paid and owned, Knotch enables companies to connect content to business outcomes, enhancing brand, increasing ROI, and building audiences.

Data Engineer – Full-Time Permanent Employee – 100% Remote – United States

Our Data Engineering team focuses on large-scale data projects, integrating numerous sources with our real-time streaming analytics platform. Knotch’s product brings together data from sources across the internet. To take our data practices to the next level, we’re looking for a Data Engineer to join us as an Individual Contributor on our Data Engineering team!


The Data Engineer Role

As Knotch’s Data Engineer, you will be responsible for solving a variety of data challenges, including orchestrating the integration of data from internal and third-party sources and proactively solving the architecture and programming challenges required to support our fast-growing data needs. You will work closely with, and report directly to, our Principal Data Engineer, Shane Jiang.

  • Collaborate with team members to collect product requirements, define successful analytics outcomes, and design data models
  • Design, build, and launch new ETL/ELT processes in production
  • Support existing processes running in production
  • Build and maintain cross-functional relationships with teams across the business
  • Build data expertise and own data quality for your assigned areas


Requirements

  • 2+ years in the data space as an engineer, analyst, or equivalent
  • Knowledge of SQL and data modeling
  • Experience working with a large-scale data warehouse, preferably in a cloud environment
  • Experience with Python, Airflow, and/or dbt
  • Experience in custom ETL design, implementation, and maintenance
  • Ability to analyze data to identify deliverables, gaps, and inconsistencies
  • Demonstrated capacity to communicate clearly and concisely with technical and business teams
  • Familiarity with AWS cloud services, including S3, Redshift, RDS, and EMR

Salary Range: $90,000 – $150,000