Join our team!

We are looking for a Data Engineer to design, build, and optimize scalable data pipelines and architectures. You’ll work with dbt, Airflow, and Snowflake to deliver high-performance solutions on Google Cloud Platform. If you have strong SQL, Python, and ETL/ELT expertise and thrive in building reliable, cost-efficient data systems, join us to shape the backbone of our data infrastructure.


What is your mission? 

You will provide the best service to our partner brands by performing these tasks:

  • Design, build, and optimize ETL/ELT pipelines.
  • Develop and maintain dbt models for data transformation.
  • Create and manage Airflow/Composer DAGs for orchestration.
  • Architect and implement scalable Snowflake solutions.
  • Develop dimensional models and data marts.
  • Implement foreign data wrapper (FDW) connections for cross-database queries.
  • Integrate data from multiple source systems.
  • Deploy and manage data infrastructure on GCP.
  • Enhance pipeline performance, reliability, and cost-efficiency.
  • Implement and monitor data quality checks and SLAs.
  • Maintain documentation, data dictionaries, and process guides.
  • Conduct code reviews and collaborate cross-functionally.
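To give a flavor of the data quality checks and SLAs mentioned above, here is a minimal sketch in plain Python. The table shape, column names, and thresholds are hypothetical, not taken from an actual pipeline:

```python
# Minimal data quality check sketch, assuming extracted rows arrive as
# dicts. Column names and thresholds below are illustrative only.

def run_quality_checks(rows, required_columns, max_null_rate=0.05, min_rows=1):
    """Return a dict mapping check name -> bool (True = check passed)."""
    results = {"row_count": len(rows) >= min_rows}
    for col in required_columns:
        nulls = sum(1 for r in rows if r.get(col) is None)
        null_rate = nulls / len(rows) if rows else 1.0
        results[f"null_rate:{col}"] = null_rate <= max_null_rate
    return results

# Example: one of three 'amount' values is NULL (null rate ~0.33).
sample = [
    {"order_id": 1, "amount": 9.99},
    {"order_id": 2, "amount": None},
    {"order_id": 3, "amount": 4.50},
]
checks = run_quality_checks(sample, ["order_id", "amount"], max_null_rate=0.5)
```

In practice a check like this would run as a task in an Airflow/Composer DAG (or as a dbt test) and fail the run or page the on-call engineer when a check returns False.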


Who are we looking for?

  • Bachelor’s degree in Computer Science, IT, or related field.
  • 3–7 years of professional experience as a Data Engineer.
  • Proven experience building and maintaining scalable data pipelines.
  • Hands-on experience with data warehouse architecture and implementation.
  • Strong experience with dbt for data transformation and modeling.
  • Expertise in Snowflake and Google Cloud Platform (GCP).
  • Advanced SQL skills (complex queries, optimization, indexing, window functions).
  • Proficiency in PostgreSQL and federated queries (postgres_fdw).
  • Strong Python skills (Pandas, NumPy, SQLAlchemy) and Bash scripting.
  • Experience with REST API integration and Git version control.
  • Familiarity with CI/CD for data pipelines.
  • Knowledge of ETL/ELT design, CDC, and handling structured/semi-structured data.
  • Experience with orchestration tools (Airflow, Cloud Composer).
  • Understanding of data governance, quality frameworks, and security standards.
  • Certifications (Snowflake SnowPro, GCP Professional Data Engineer, dbt Analytics Engineering) are a plus.
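For a sense of the window-function SQL this role calls for, here is a small self-contained sketch using Python's bundled sqlite3 module. The table and query are illustrative; day-to-day work would target Snowflake or PostgreSQL, where the same SQL pattern applies:

```python
import sqlite3

# Illustrative only: pick each customer's latest order with ROW_NUMBER(),
# run against an in-memory SQLite database (stdlib, no setup needed).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer_id INTEGER, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, '2024-01-05', 20.0),
        (1, '2024-02-10', 35.0),
        (2, '2024-01-20', 15.0);
""")

latest = conn.execute("""
    SELECT customer_id, order_date, amount
    FROM (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id
                   ORDER BY order_date DESC
               ) AS rn
        FROM orders
    )
    WHERE rn = 1
    ORDER BY customer_id
""").fetchall()
# latest -> [(1, '2024-02-10', 35.0), (2, '2024-01-20', 15.0)]
```

Deduplicating to the most recent row per key like this is a staple of ELT staging models, which is why window-function fluency sits alongside dbt and Snowflake in the list above.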

Company Perks

  • Free learning and development courses for your personal and career growth
  • Comprehensive HMO benefits and insurance from day 1
  • Dynamic company events
  • Above-industry salary package and incentives
  • Opportunities for promotion
  • Free meals and snacks

Our Values

Worldwide, we strongly uphold our values to be of service to our people, our clients, and our community.

WE PUT PEOPLE FIRST

We consider our people as the foundation of our success.

WE STRIVE FOR EXCELLENCE

Our commitment to quality ensures that we always do our best.

WE EMBRACE INNOVATION

We stay agile and fast, always looking for ways to solve our clients’ needs.

WE DELIVER DELIGHT

We pride ourselves on helping our clients reach their full potential.

WE CREATE REAL IMPACT

We do things right and we get the job done.