Senior Data Engineer – Python | PySpark | GCP

  • Company : Multicloud4u Technologies
  • Requirement Type : Full Time
  • Industry : Banking and Finance
  • Location : Hyderabad, Telangana, India (IN)
  • Key Skills : Python, PySpark, Google Cloud Platform (GCP), BigQuery, Dataflow, Cloud Functions, Cloud Storage, Distributed Computing, ETL, ELT, Data Modeling, CI/CD, Git, Docker, Kubernetes, Airflow, Cloud Composer, DevOps, Monitoring Tools, Performance Optimization, Data Engineering, GCP Professional Data Engineer Certification.
Job highlights
  • Experience in Year : 6 - 7
  • Domain Requirements : IT
  • Domain Experience : 6
  • Authorized To Work : India
Description

Job Summary:

We are actively seeking an experienced Senior Data Engineer with strong expertise in Python, PySpark, and Google Cloud Platform (GCP). The ideal candidate will have 5–8 years of relevant experience developing and managing data pipelines, working with distributed systems, and deploying scalable data solutions. Candidates who can join quickly and are based in (or willing to relocate to) Hyderabad or Pune are preferred.

Key Responsibilities:

  • Develop, optimize, and maintain scalable data pipelines using Python and PySpark.
  • Design and implement data processing workflows leveraging GCP services such as:
  1. BigQuery
  2. Dataflow
  3. Cloud Functions
  4. Cloud Storage
  • Ensure robust data ingestion, transformation, and loading (ETL/ELT) processes.
  • Collaborate with data scientists, analysts, and engineering teams to understand data requirements and deliver clean, structured datasets.
  • Maintain version control using Git and automate deployments through CI/CD pipelines.
  • Implement containerized solutions using Docker (Kubernetes experience is a plus).
  • Monitor, troubleshoot, and optimize system performance.
  • Follow industry best practices for data governance, security, and quality.
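
The ETL/ELT responsibilities above can be sketched as a minimal pipeline. This is a hypothetical, plain-Python illustration only: an in-memory list stands in for a real source, and a dict stands in for a sink such as BigQuery; production work here would use PySpark and GCP services instead.

```python
# Minimal, hypothetical sketch of the extract-transform-load pattern.
# Plain Python stands in for PySpark; the "warehouse" dict stands in
# for a real destination such as a BigQuery table.

def extract(source):
    """Pull raw records from a source (here, an in-memory list)."""
    return list(source)

def transform(records):
    """Clean and reshape: drop incomplete rows, normalize names, cast amounts."""
    cleaned = []
    for r in records:
        if r.get("name") and r.get("amount") is not None:
            cleaned.append({
                "name": r["name"].strip().title(),
                "amount": float(r["amount"]),
            })
    return cleaned

def load(records, warehouse):
    """Write transformed rows to the destination, keyed by name."""
    for r in records:
        warehouse[r["name"]] = r["amount"]
    return warehouse

raw = [
    {"name": "  alice smith ", "amount": "120.5"},
    {"name": None, "amount": "10"},   # dropped by transform: missing name
    {"name": "bob jones", "amount": 75},
]
warehouse = load(transform(extract(raw)), {})
print(warehouse)  # {'Alice Smith': 120.5, 'Bob Jones': 75.0}
```

The same extract/transform/load split carries over directly to PySpark DataFrames, where each stage becomes a read, a chain of transformations, and a write.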

Required Skills:

  • 5–8 years of professional experience in Python and PySpark development.
  • Hands-on expertise with GCP data services (BigQuery, Dataflow, Cloud Functions, Cloud Storage).
  • Strong understanding of distributed computing and big data processing frameworks.
  • Experience with data modeling, ETL/ELT pipelines, and performance tuning.
  • Familiarity with Git, Docker, and CI/CD workflows.
  • Ability to work both independently and as part of a team in a fast-paced environment.
  • Strong analytical and problem-solving abilities.

Preferred Qualifications:

  • GCP Professional Data Engineer Certification.
  • Experience with workflow orchestration tools like Apache Airflow or Cloud Composer.
  • Exposure to DevOps practices and monitoring tools (e.g., Stackdriver, Prometheus, Grafana).
Contact Recruiter : [email protected]
Note: This requirement is from either Multicloud4u Technologies or one of its global partners. Please contact the recruiter directly for further information.
