
Join our team!

We are looking for a highly skilled Data Engineer to design, build, and optimize large-scale data pipelines, develop dbt models, and implement Snowflake- and GCP-based data architectures. You will manage orchestration with Airflow/Composer, ensure data quality, maintain documentation, and support production workloads. If you excel at modern data engineering, cloud systems, and scalable architecture, this role is for you.


What is your mission? 

You will provide the best service to our partner brands by performing these tasks:

  • Design, build, and optimize ETL/ELT data pipelines
  • Develop and maintain dbt models implementing business logic
  • Create and manage Airflow/Composer DAGs for orchestration
  • Architect and implement scalable Snowflake solutions
  • Build dimensional models and data marts
  • Implement foreign data wrapper (FDW) connections for cross-database querying
  • Integrate data from multiple source systems
  • Deploy and manage data infrastructure on GCP (BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Composer, Cloud Functions)
  • Improve pipeline performance, reliability, and cost efficiency
  • Implement and monitor data quality checks, SLAs, and anomaly detection
  • Maintain documentation, data dictionaries, and pipeline guides
  • Conduct code reviews and collaborate with cross-functional teams
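
To give a flavor of the data-quality work listed above (purely illustrative, not part of the role description), here is a minimal anomaly-detection sketch in Python using only the standard library. The function name, input shape, and z-score threshold are all assumptions for the example:

```python
import statistics

def flag_anomalies(daily_counts, threshold=3.0):
    """Flag days whose row count deviates more than `threshold`
    standard deviations from the mean (a simple z-score check).

    daily_counts: dict mapping day label -> row count.
    Returns the list of day labels considered anomalous.
    """
    values = list(daily_counts.values())
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)  # population standard deviation
    if stdev == 0:
        # All days identical: nothing can be anomalous.
        return []
    return [day for day, count in daily_counts.items()
            if abs(count - mean) / stdev > threshold]
```

In practice a check like this would run as a task after each pipeline load, alerting (or failing the DAG) when a day's volume falls outside the expected band.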


Who are we looking for?

  • Bachelor’s degree in Computer Science, IT, Engineering, or a related field
  • 3–7 years of experience as a Data Engineer
  • Strong experience designing and maintaining ETL/ELT pipelines
  • Hands-on experience with Snowflake, dbt, and GCP (BigQuery, Cloud Composer, Cloud Storage, Cloud Functions, Dataflow, Pub/Sub)
  • Advanced SQL skills (complex queries, optimization, window functions, CTEs)
  • Experience with PostgreSQL, postgres_fdw, MySQL, or SQL Server
  • Proficiency in Python (Pandas, NumPy, SQLAlchemy) and bash/shell scripting
  • Experience integrating data from APIs, databases, and varied file formats
  • Knowledge of dimensional modeling, data warehousing, and cloud data architecture
  • Familiarity with Git, REST APIs, CI/CD workflows, and orchestration tools
  • Nice to have: Kafka/streaming, Airbyte/Fivetran, Spark/Beam, Terraform, or data observability tools
  • Strong analytical thinking and problem-solving skills
  • Excellent communication, collaboration, and documentation ability
  • Detail-oriented, organized, and comfortable working in fast-paced environments
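
To illustrate the "advanced SQL (window functions, CTEs)" expectation above, a small self-contained sketch using Python's bundled sqlite3 module; the table, columns, and data are invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
INSERT INTO orders VALUES
  ('alice', '2024-01-01', 100.0),
  ('alice', '2024-01-05', 250.0),
  ('bob',   '2024-01-02',  75.0),
  ('bob',   '2024-01-09', 125.0);
""")

# CTE wrapping a window function: per-customer running total by date.
query = """
WITH ranked AS (
    SELECT customer,
           order_date,
           amount,
           SUM(amount) OVER (
               PARTITION BY customer
               ORDER BY order_date
           ) AS running_total
    FROM orders
)
SELECT * FROM ranked ORDER BY customer, order_date;
"""
rows = conn.execute(query).fetchall()
for row in rows:
    print(row)
```

The same pattern (CTEs feeding window functions) carries over directly to Snowflake and BigQuery SQL.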


Company Perks

Free learning and development courses for your personal and career growth

Comprehensive HMO benefits and insurance from day one

Dynamic company events

Above-industry salary package and incentives

Opportunities for promotion

Free meals and snacks

Our Values

Worldwide, we strongly uphold our values to be of service to our people, our clients, and our community.

WE PUT PEOPLE FIRST

We consider our people as the foundation of our success.

WE STRIVE FOR EXCELLENCE

Our commitment to quality ensures that we always do our best.

WE EMBRACE INNOVATION

We stay agile and fast, always looking for ways to solve our clients’ needs.

WE DELIVER DELIGHT

We pride ourselves on helping our clients reach their full potential.

WE CREATE REAL IMPACT

We do things right and we get the job done.