Bank of America
Bank of America is one of the world's leading financial institutions, serving individual consumers, small and middle-market businesses, and large corporations with a full range of banking, investing, asset management, and other financial and risk management products and services. We are committed to attracting and retaining top talent worldwide to ensure our continued success. Along with taking care of our customers, we want to be a great place for people to work, creating a work environment where all employees have the opportunity to achieve their goals.
We are a part of Global Business Services, which delivers technology and operations capabilities to all Bank of America lines of business (LOBs) and enterprise functions.
Our employees help our clients and customers at every stage of their financial lives, focusing on what matters most to them. This purpose defines and unites us. Every day, we are focused on delivering value, convenience, expertise and innovation to the individuals, businesses and institutional investors we serve around the world.
* BA Continuum is a nonbank subsidiary of Bank of America, part of Global Business Services in the bank.
The individual will be part of a new team that will manage the Zaloni Data Lake Platform as well as lower-lane Hadoop administration as part of the Data Analytics Platform (DAP) DevOps team.
As part of DAP emerging platforms, the employee will work in the capacity of a Senior Developer/Architecture Analyst to architect and enable new capabilities on the platform. The candidate will work on new product evaluation, certification, and defining standards for tool fitment on the platform.
The successful candidate will work closely with desk users and application stakeholders. He/she needs to be hands-on and able to perform in a busy, high-pressure environment.
- Hands on design and actual work experience in required technologies
- End to end development responsibilities
- Provide quick technology solutions
- Ability to reverse engineer existing code
- Excellent problem solving skills
- Interact and collaborate with global technology teams
- Handle user requests and production issues
- Openness to learning and adopting new frameworks and technologies
- Flawless, on-time project delivery
- Thorough understanding of, and working experience with, the Cloudera/Hortonworks Hadoop distribution ecosystem, namely YARN, Impala, Hive, Flume, HBase, Sqoop, Apache Spark, Apache Storm, Crunch, Java, Oozie, Pig, Scala, Python, Kerberos/Active Directory/LDAP, etc.
- TPC benchmarks and other emerging industry Big Data platform engineering and performance tools
- Good exposure to CI/CD tools, application hosting, and containerization concepts
- Exposure to Cloud computing and object storage services/platforms
- Excellent verbal and written communication skills; strong team skills
- Proficient with MS Office tools
- Strong analytical and problem-solving skills
- Must be a self-starter with excellent interpersonal skills
- Knowledge of visual analytics tools (Tableau)
- Development of complex Tableau reports connecting to HDFS-based tables
- Experience with both Tableau Desktop and Tableau Server for developing and publishing reports
Experience Range: 2 to 4 Years
Education: Bachelor's/Master's degree in Computer Science or Engineering
Certifications If Any: NA
Work Timings: 10:30 am – 7:30 pm general shift; may include weekend support
To apply for this job please visit ghr.wd1.myworkdayjobs.com.