Sr. ETL Engineer

Grazitti Interactive


City: Panchkula, Haryana
Contract type: Full time
Job Description

We are looking for a skilled and experienced ETL Engineer to join our growing team at Grazitti Interactive. In this role, you will be responsible for building and managing scalable data pipelines across traditional and cloud-based platforms. You will work with structured and unstructured data sources, leveraging tools such as SQL Server, Snowflake, Redshift, and BigQuery to deliver high-quality data solutions. If you have hands-on experience in Python, PySpark, and cloud platforms like AWS or GCP, along with a passion for transforming data into insights, we’d love to connect with you.

Key Skills

  • Strong experience (4–10 years) in ETL development on platforms such as SQL Server and Oracle, as well as cloud environments such as Amazon S3, Snowflake, Redshift, data lakes, and Google BigQuery.
  • Proficient in Python, with hands-on experience creating data pipelines using APIs.
  • Solid working knowledge of PySpark for large-scale data processing.
  • Ability to deliver results in a variety of formats, including JSON, data feeds, and reports.
  • Skilled in data manipulation, schema design, and transforming data across diverse sources.
  • Strong understanding of core AWS/GCP services and basic cloud architecture.
  • Capable of developing, deploying, and debugging cloud-based data assets.
  • Expert-level proficiency in SQL with a solid grasp of relational and cloud-based databases.
  • Excellent ability to understand and adapt to evolving business requirements.
  • Strong communication and collaboration skills, with experience in onsite/offshore delivery models.
  • Familiarity with Marketo, Salesforce, Google Analytics, and Adobe Analytics.
  • Working knowledge of Tableau and Power BI for data visualization and reporting.

Roles and Responsibilities

  • Design and implement robust ETL processes to ensure data integrity and accuracy across systems.
  • Develop reusable data solutions and optimize performance across traditional and cloud environments.
  • Collaborate with cross-functional teams, including data analysts, marketers, and engineers, to define data requirements and deliver insights.
  • Take ownership of end-to-end data pipelines, from requirement gathering to deployment and monitoring.
  • Ensure compliance with internal QMS (quality management) and ISMS (information security management) standards.
  • Proactively report any data incidents or concerns to reporting managers.