Job description
Role Overview (Only H1B)

We are looking for a skilled Data Engineer with 8+ years of strong hands-on experience in building scalable and efficient data pipelines. The ideal candidate will have deep expertise in DBT, Snowflake, and Python, and a solid understanding of modern data engineering practices. You will be responsible for designing, developing, and optimizing data pipelines and transformation workflows to support analytics, reporting, and business intelligence needs.

Key Responsibilities
- Design, develop, and maintain scalable data pipelines using DBT, Snowflake, and Python
- Build and manage data transformation workflows using DBT (models, macros, testing, and documentation)
- Develop efficient data ingestion and processing pipelines using Python
- Optimize Snowflake data models, queries, and performance for large-scale datasets
- Ensure data quality, reliability, and consistency through testing and validation frameworks
- Collaborate with data analysts, business teams, and stakeholders to understand data requirements
- Implement best practices for data modeling, version control, and pipeline orchestration
- Monitor and troubleshoot data pipeline issues and performance bottlenecks
- Maintain clear and structured documentation for data workflows and pipelines
Required Skills & Qualifications
- 8+ years of hands-on experience in Data Engineering
- Strong expertise in DBT (Data Build Tool)
- Solid experience with Snowflake (data warehousing, performance tuning, optimization)
- Strong expertise in Python for data processing and pipeline development
- Good understanding of SQL and data modeling concepts (star schema, dimensional modeling)
- Experience with data pipeline orchestration tools (Airflow or similar)
- Familiarity with data quality and testing frameworks
- Strong problem-solving and analytical skills
Preferred Qualifications
- Experience working in cloud environments (AWS, Azure, or GCP)
- Familiarity with CI/CD practices for data pipelines
- Exposure to real-time or streaming data pipelines is a plus
- Experience in enterprise-scale data platforms
Key Deliverables
- Robust and scalable data pipelines
- Optimized Snowflake data models and queries
- DBT-based transformation frameworks
- High-quality, reliable datasets for analytics and reporting