Job Description
  • Strong data engineering experience using Java, Python, or Spark on Google Cloud
  • Experience handling big data
  • Strong communication skills
  • Experience with Agile methodologies
  • ETL/ELT skills
  • Data movement and data processing skills
  • Experience using Google Cloud Storage (GCS), Apache Beam (Dataflow), BigQuery, Bigtable, Dataproc, or other Google Cloud Data and Analytics services (mandatory)
  • Experience with distributed, columnar, and/or analytics-oriented databases, or with distributed data processing frameworks
  • Experience with open-source distributed storage and processing utilities in the Apache Hadoop family and/or workflow orchestration products such as Apache Airflow
  • Background in data analytics, warehousing, ETL development, data science, or other Big Data applications is preferred
Primary Skill
  • Python
  • Hadoop
  • Java
Secondary Skill
  • GCP