PySpark Jobs at Capgemini in Bengaluru for 4 to 6 Years

  • Exp. Jobs
  • Full Time
  • IT/Software Jobs
  • Bengaluru
  • Salary: 45000-80000
  • Exp. 4-6 Years

Website Capgemini

BE/B.Tech

Job Description for PySpark Jobs at Capgemini

  1. Must have experience implementing an AWS Big Data Lake using EMR and Spark
  2. 3 years of experience working with Spark, Hive, and message-queue or pub/sub streaming technologies
  3. 6 years of experience developing data pipelines using a mix of languages (Python, Scala, SQL, etc.) and open-source frameworks for data ingestion, processing, and analytics
  4. Experience with open-source big-data processing frameworks and streaming technologies such as Apache Spark, Hadoop, and Kafka
  5. Hands-on experience with newer data-space technologies such as Spark, Airflow, Apache Druid, Snowflake, or any other OLAP database
  6. Experience developing and deploying data pipelines and real-time data streams within cloud infrastructure, preferably AWS

Primary skills for PySpark Jobs at Capgemini

  1. PySpark
  2. AWS

Secondary skills

  1. Experience using CI/CD pipelines (GitLab)
  2. Experience implementing code-quality checks
  3. Used PEP 8/Pylint or any other code-quality tool
  4. Experience with Python plugins/operators such as the FTP sensor, Oracle operator, etc.

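The first three secondary skills combine naturally: a GitLab CI pipeline that runs Pylint as a quality gate. The fragment below is a hypothetical `.gitlab-ci.yml` sketch; the image tag, source path, and score threshold are assumptions, not part of the posting.

```yaml
# Hypothetical .gitlab-ci.yml stage enforcing code quality with Pylint.
stages:
  - lint

pylint:
  stage: lint
  image: python:3.10        # assumed base image
  script:
    - pip install pylint
    - pylint --fail-under=8.0 src/   # fail the job if the score drops below 8.0
```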


To apply for this job, please visit www.capgemini.com.
