Data Engineer

hackajob


Date: 2 weeks ago
City: Pune, Maharashtra
Contract type: Full time
hackajob is collaborating with Wipro to connect them with exceptional tech professionals for this role.

Title: Data Engineer

Requisition ID: 64694

City: Pune

Country/Region: IN

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients’ most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

Job Description

Role Purpose

The purpose of the role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialist team.


Key skills: Azure Data Factory (primary), Azure Databricks Spark (PySpark, SQL)

Must-have skills

  • Cloud certification in one of these categories: Azure Data Engineer
  • Azure Data Factory, Azure Databricks Spark (PySpark or Scala), SQL, data ingestion and curation
  • Semantic modelling/optimization of the data model to work within Rahona
  • Experience in Azure ingestion from on-prem sources, e.g. mainframe, SQL Server, Oracle
  • Experience in Sqoop/Hadoop
  • Microsoft Excel (for metadata files with requirements for ingestion)
  • Any other Azure/AWS/GCP certificate and hands-on cloud data engineering experience
  • Strong programming skills in at least one of Python, Scala, or Java
  • Strong SQL skills (T-SQL or PL/SQL)
  • Data file movement via mailbox
  • Source-code versioning/promotion tools, e.g. Git/Jenkins
  • Orchestration tools, e.g. Autosys, Oozie

Nice-to-have skills

  • Experience working with mainframe files
  • Experience in an Agile environment and with JIRA/Confluence tools


Handle technical escalations through effective diagnosis and troubleshooting of client queries

Manage and resolve technical roadblocks/escalations as per SLA and quality requirements

If unable to resolve an issue, escalate it to the TA & SES in a timely manner

Provide product support and resolution to clients by performing question diagnosis and guiding users through step-by-step solutions

Troubleshoot all client queries in a user-friendly, courteous and professional manner

Offer alternative solutions to clients (where appropriate) with the objective of retaining customers’ and clients’ business

Organize ideas and effectively communicate oral messages appropriate to listeners and situations

Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs


Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client

Mentor and guide Production Specialists on improving technical knowledge

Collate trainings to be conducted as triages to bridge the skill gaps identified through interviews with the Production Specialists

Develop and conduct trainings (triages) within products for Production Specialists as per target

Inform client about the triages being conducted

Undertake product trainings to stay current with product features, changes, and updates

Enroll in product-specific and any other trainings per client requirements/recommendations

Identify and document most common problems and recommend appropriate resolutions to the team

Update job knowledge by participating in self-learning opportunities and maintaining personal networks


Deliver

No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day; compliance with process and quality standards; meeting process-level SLAs; Pulse score; customer feedback; NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability Development | Triages completed; Technical Test performance

Experience: 5 to 10 years

