Job Description

Job Title: Spark, Big Data, PySpark

Total years of Experience*: 5-10+ Years
Relevant years of experience*: 4-6+ Years
Specific Work Location: Pune, Hyderabad
Mode of Interview: Telephonic/Face to Face/Skype Interview

We are looking for individuals with the following skill set.

What are we looking for?

Mandatory Skills:

  • Hands-on design and build experience developing applications with Spark, Big Data, and PySpark
  • A good understanding of data warehousing (DW) and ingestion concepts is a must
  • Experience delivering projects using Agile Scrum

Desired skills*:

  • Hands-on experience in 3-4 of the following skills:
    • Spark
    • PySpark
    • Scala
    • Real Time Streaming
    • Kafka
    • Python
    • Hive
    • Unix Scripting
  • Experience with Teradata and SQL Server will be a definite plus

Suitable candidates can share their profiles with