
Senior Data Engineer / Analytics Engineer at Mercor

posted 17 hours ago
mercor.com | Contractor | Remote in India | $35–70/hr

Senior Data/Analytics Engineer | $35–70/hr | Remote in India | 20–40 hrs/week

Mercor is partnering with a cutting-edge AI research lab to hire a Senior Data/Analytics Engineer with deep expertise in DBT and Snowflake's Cortex CLI. You'll build and scale Snowflake-native data and ML pipelines, leveraging Cortex's emerging AI/ML capabilities while maintaining production-grade DBT transformations. You'll work closely with data engineering, analytics, and ML teams to prototype, operationalize, and optimize AI-driven workflows, defining best practices for Snowflake-native feature engineering and model lifecycle management. This is a high-impact role within a modern, fully cloud-native data stack.

Responsibilities

  • Design, build, and maintain DBT models, macros, and tests following modular data modeling and semantic best practices
  • Integrate DBT workflows with Snowflake Cortex CLI, enabling feature engineering pipelines, model training & inference tasks, automated pipeline orchestration, and monitoring/evaluation of Cortex-driven ML models (a minimal model sketch follows this list)
  • Establish best practices for DBT–Cortex architecture and usage patterns
  • Collaborate with data scientists and ML engineers to productionize Cortex workloads in Snowflake
  • Build and optimize CI/CD pipelines for DBT (GitHub Actions, GitLab, Azure DevOps); a sample workflow sketch also follows this list
  • Tune Snowflake compute and queries for performance and cost efficiency
  • Troubleshoot issues across DBT artifacts, Snowflake objects, lineage, and data quality
  • Provide guidance on DBT project governance, structure, documentation, and testing frameworks
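
For a flavor of the day-to-day work, here is a minimal, hypothetical sketch of a DBT model and test of the kind this role would own. The model, source, and column names are illustrative only, and the Cortex call shown is Snowflake's SQL-level SENTIMENT function rather than the Cortex CLI itself:

```sql
-- models/marts/fct_ticket_sentiment.sql  (hypothetical model name)
-- Incremental DBT model that enriches support tickets with a
-- Snowflake Cortex sentiment score, computed natively in Snowflake.
{{ config(materialized='incremental', unique_key='ticket_id') }}

select
    ticket_id,
    submitted_at,
    ticket_text,
    -- Cortex SQL function; returns a score between -1 and 1
    snowflake.cortex.sentiment(ticket_text) as sentiment_score
from {{ ref('stg_support_tickets') }}  -- assumed staging model
{% if is_incremental() %}
  -- only process tickets newer than what is already in the table
  where submitted_at > (select max(submitted_at) from {{ this }})
{% endif %}
```

```yaml
# models/marts/schema.yml (illustrative)
version: 2
models:
  - name: fct_ticket_sentiment
    columns:
      - name: ticket_id
        tests:
          - unique
          - not_null
```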
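Similarly, CI/CD for DBT typically means running the project's build and tests on every pull request. A minimal GitHub Actions sketch is below; the workflow name, target, and secret names are assumptions, and the project's profiles.yml is assumed to read credentials from these environment variables:

```yaml
# .github/workflows/dbt_ci.yml (illustrative)
name: dbt-ci
on:
  pull_request:
    branches: [main]

jobs:
  dbt-build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: Install dbt for Snowflake
        run: pip install dbt-snowflake
      - name: Run models and tests
        env:
          # secret names are hypothetical; profiles.yml would read them via env_var()
          SNOWFLAKE_ACCOUNT: ${{ secrets.SNOWFLAKE_ACCOUNT }}
          SNOWFLAKE_USER: ${{ secrets.SNOWFLAKE_USER }}
          SNOWFLAKE_PASSWORD: ${{ secrets.SNOWFLAKE_PASSWORD }}
        run: |
          dbt deps
          dbt build --target ci   # assumes a "ci" target in profiles.yml
```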

Required Qualifications

  • 3+ years experience with DBT Core or DBT Cloud, including macros, packages, testing, and deployments
  • Strong expertise with Snowflake (warehouses, tasks, streams, materialized views, performance tuning)
  • Hands-on experience with Snowflake Cortex CLI, or strong ability to learn it quickly
  • Strong SQL skills; working familiarity with Python for scripting and DBT automation
  • Experience integrating DBT with orchestration tools (Airflow, Dagster, Prefect, etc.)
  • Solid understanding of modern data engineering, ELT patterns, and version-controlled analytics development

Nice-to-Have Skills

  • Prior experience operationalizing ML workflows inside Snowflake
  • Familiarity with Snowpark, Python UDFs/UDTFs
  • Experience building semantic layers using DBT metrics
  • Knowledge of MLOps / DataOps best practices
  • Exposure to LLM workflows, vector search, and unstructured data pipelines

Why Join

  • Work as an hourly contractor through Mercor with 20–40 hours per week flexibility
  • Direct opportunity to build next-generation Snowflake AI/ML systems with Cortex
  • High-impact ownership of DBT and Snowflake architecture across production pipelines
  • Work alongside top-tier ML engineers, data scientists, and research teams
  • Fully remote, high-autonomy environment focused on innovation, velocity, and engineering excellence

Benture is an independent job board and is not affiliated with or employed by Mercor.

Tips for Applying to Mercor Jobs from Benture

Increase your chances of success!
1. Four Simple Steps

Upload resume → AI interview → Complete form → Submit application

2. Perfect Your Resume

Upload your best, up-to-date resume in English. Mercor will extract details and fill out your profile automatically. Review and adjust as needed.

3. Complete = Win

SHOCKING FACT: Only ~20% of applicants complete their application! Take the 15-minute AI interview about your experience and you'll have MUCH HIGHER chances of getting hired!

AI Interview Tips: The interview focuses on your resume and work experience. Be ready to discuss specific projects and how you solved challenges.

Takes about 15 minutes | Dramatically improves your chances
