Data Engineer - Snowflake & Matillion

Udupi, India

We are looking for an experienced Data Engineer with a strong foundation in data engineering best practices and experience with modern data warehousing and ETL platforms. The ideal candidate will have 5-7 years of hands-on experience, preferably with Snowflake, SQL Server, Matillion ETL, Azure, and Kafka, building, optimizing, and maintaining data pipelines and data integration solutions. This role involves working closely with cross-functional teams to support data-driven initiatives across the organization.

Join a dynamic and growing team at CredibleCFO!

Skills Required 
Data Engineering & ETL Development
Data Warehousing & Modeling
Programming & Scripting
Cloud Platforms & System Integration

What We’re Looking For:

 1. Data Engineering & ETL Development:

     - Experience designing and maintaining scalable ETL pipelines.

     - Proficient in Matillion ETL for data ingestion and transformation.

     - Hands-on experience with Snowflake and SQL Server.


 2. Programming & Scripting:

      - Strong skills in SQL and Python.

      - Familiar with JSON, YAML, and REST API integration.

      - Experience building Kafka producers and consumers.


 3. Data Warehousing & Modeling:

      - Solid understanding of data warehousing concepts.

      - Skilled in schema design and data modeling within Snowflake.

      - Performance tuning of SQL queries and stored procedures.


 4. Cloud Platforms & System Integration:

     - Experience with Azure (preferred).

     - Knowledge of platform monitoring and pipeline optimization.

     - Familiarity with marketing platforms like Braze and CRM integrations.


 5. Education & Professional Qualifications:

     - Bachelor’s degree in Computer Science, Information Technology, or a related field.

     - 5+ years of professional experience in data engineering.

     - Strong analytical, problem-solving, and collaboration skills.


Responsibilities:

  1. Data Pipeline Development & Maintenance:

    - Design, build, and maintain scalable ETL pipelines using Matillion to load data into Snowflake.

    - Integrate data from diverse sources, ensuring high quality, reliability, and performance.

    - Optimize SQL queries, scripts, and stored procedures in Snowflake and SQL Server.

    - Work with tools like Kafka for streaming data and implement producers/consumers.

    - Apply data parsing best practices; build, deploy, and troubleshoot workflows.

    - Apply proficiency in SQL, Python, JSON, YAML, and REST APIs.

    - Experience with marketing automation platforms such as Braze is a plus.

   2. Data Warehousing & Modeling:

    - Develop and maintain data models in Snowflake for BI and analytics.

    - Collaborate with data architects and analysts to enhance warehouse schemas.

   3. Collaboration & Requirements Gathering:

    - Partner with stakeholders to gather data requirements and define technical specs.

    - Support data teams with ingestion, storage, and retrieval strategies.

   

   4. Platform Monitoring & Optimization:

    - Monitor ETL and data pipeline performance; identify and resolve bottlenecks.

    - Conduct root cause analysis to address data issues and improve system reliability.


Pay: CTC ₹20,00,000.00 - ₹22,00,000.00 per year

    (Depending on the candidate's experience and skills.)



Schedule:

Monday to Friday

U.S. shift timings