Data Engineer - Intermediate

SourcingXPress


Date: 1 week ago
City: Thiruvananthapuram, Kerala
Salary: ₹1,100,000 - ₹1,900,000 per year
Contract type: Full time
Company: Equifax

Business Type: Enterprise

Company Type: Product

Business Model: B2B2C

Funding Stage: IPO/Public

Industry: Fintech

Job Description

Who We Are

We help people live their financial best

As a trusted global leader in data, analytics, and technology, we deliver insights that power decisions to move people forward.

Our Commitments Reflect Our Values

We are nearly 15,000 bright minds determined to do the right thing and put people first as we strive to create economically healthy individuals and communities everywhere we do business.

About The Job

What you’ll do

  • With moderate supervision, manage project progress and metadata collection, development, and management.
  • Perform investigations into internal/external stakeholder queries with high-level direction from the Team Leader.
  • Analyze problems, identify root causes, formulate findings and observations, suggest resolutions, and communicate them to internal/external stakeholders with moderate guidance from the Team Leader.
  • Maintain current knowledge of industry regulatory requirements such as reporting mandates, concepts and procedures, compliance requirements, and regulatory framework and structure. Be able to support internal/external queries on data standards.
  • Enter/maintain information in documentation repository.
  • Follow established security protocols; identify and report potential vulnerabilities.
  • Perform intermediate level data quality checks, following established procedures.

What Experience You Need

  • BS degree in a STEM major or equivalent discipline; Master’s Degree strongly preferred
  • 4+ years of experience as a data engineer or related role
  • Cloud certification strongly preferred
  • Intermediate skills using programming languages such as Python, SQL or scripting languages
  • Basic understanding of and experience with Google Cloud Platform, and an overall understanding of cloud computing concepts
  • Experience building and maintaining simple data pipelines: following established guidelines, transforming data, and loading it into a pipeline so the content can be consumed and reused by future projects
  • Experience supporting the design and implementation of basic data models
  • Proficient with Git and experience contributing to team repositories

What Could Set You Apart

  • Master's Degree
  • Experience with GCP (Cloud certification strongly preferred)
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
  • Experience with data pipeline and workflow management tools: Airflow, GCP Dataflow, etc.
  • Experience with AI or Machine Learning
  • Experience with Data Visualization Tools such as Tableau or Looker
  • Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.