ETL & Big Data Developer – DataStage + Hadoop

  • Company : Multicloud4u Technologies
  • Requirement Type : Full Time
  • Industry : Banking and Finance
  • Location : Hyderabad, Andhra Pradesh, India (IN)
  • Key Skills : Big Data, Unix/Linux Shell Scripting, Data Pipeline Development, Agile/Scrum Methodology, Hadoop, Hive, IBM DataStage, ETL Development

  • Experience in Years : 5–8
  • Domain Requirements : Banking
  • Domain Experience : 5 years
  • Authorized To Work : India
  • Description

    Job Overview

    We are seeking an experienced ETL & Big Data Developer with strong expertise in IBM DataStage and Hadoop ecosystem tools to join our dynamic data engineering team supporting a leading global banking client. This is an exciting opportunity to work on enterprise-scale batch data processing, data integration, and performance optimization in a hybrid cloud environment.

    Key Responsibilities

    • Design, develop, and maintain ETL pipelines using IBM DataStage.
    • Work with big data platforms including Hadoop, Hive, HDFS, and Spark.
    • Implement robust and scalable ETL workflows for large-scale batch processing.
    • Optimize ETL jobs for performance, throughput, and error handling.
    • Collaborate with data architects, analysts, and business teams to understand requirements and translate them into technical solutions.
    • Ensure seamless data flow across upstream and downstream systems.
    • Follow best practices for data governance, security, and compliance.

    Required Skills & Experience

    • 5–8 years of hands-on experience with IBM InfoSphere DataStage.
    • Proficiency in Hadoop ecosystem tools like Hive, HDFS, Spark, and Oozie.
    • Strong background in SQL, data warehousing, and data integration concepts.
    • Familiarity with Unix/Linux shell scripting and job scheduling tools.
    • Experience in performance tuning, job orchestration, and error handling mechanisms.
    • Excellent problem-solving, debugging, and communication skills.
    • Experience working in Agile/Scrum environments is a plus.

    Nice to Have

    • Exposure to cloud-based data platforms (AWS EMR, Azure HDInsight, or GCP Dataproc).
    • Knowledge of Kafka, NoSQL databases, or real-time streaming technologies.
    • Understanding of data governance frameworks and metadata management tools.
    Contact Recruiter : [email protected]
    Note: This requirement is either from Multicloud4u Technologies or from one of its global partners; please contact the recruiter directly for further information.