USACares Jobs

Job Information

Hewlett Packard Enterprise Company Enterprise Analytics Big Data Engineer in Bangalore, India

Enterprise Analytics Big Data Engineer
Bangalore, Karnataka



At Hewlett Packard Enterprise, we bring together the brightest minds to create breakthrough technology solutions that advance the way people live and work. What sets us apart? Our people and our relentless dedication to helping our customers make their mark on the world.

We are a team of doers, dreamers and visionaries; inspired by our purpose and driven by our strategy. We live by our three values: partner, innovate and act.

Our legacy inspires us as we forge ahead, always pushing to discover what’s next. Every day is a new opportunity to advance and grow ourselves, our company and the industry. Some people call it an obsession; we call it a way of life.

What you need to know about the job

Job ID: 1078310

Date Posted: 3/1/2021

Primary Location: Bangalore, Karnataka

Job Category: Information Technology

Schedule: Full time

Shift: First Shift (India)

As an Enterprise Analytics Engineer, you will be fascinated by data and analytics. You will acquire, manage, showcase, and evangelize big data solutions, and build actionable insights that advance HPE's business goals. Beyond building performant MIS, BI, AI, and IT solutions for many of our businesses, you will innovate, propose, and act on opportunities that advance our Everything-as-a-Service strategy. To qualify for this role, you should have a proven track record of implementing scalable data insights and managing tier 1 DataOps. You will be a "doer": someone who is not afraid to dive deep into technology, code, and dashboards to meet tight commitments.

Key Responsibilities:

  • Design, architect, and engineer big data solutions

  • Develop a modern data analytics lakehouse

  • Propose and standardize ELT, BI, AI, governance, and analytics

  • Publish enterprise guidelines for data ingress, development, transformation, egress, and consumption

  • Instrument monitoring, self-healing, and self-describing data pipelines to meet tier 1 OLA/OpX targets; i.e., own operational outcomes

  • Implement continuous integration, continuous deployment, and DevOps practices

  • Create, document, and manage data guidelines, governance, and lineage metrics

Additional Responsibilities

  • Demonstrate a growth mindset: collaborate with customers and drive joint outcomes with passion and a penchant for success

  • Standardize DevOps, DataOps, and MLOps processes, technology, and architecture; propose and implement tailored solutions quickly, on demand

  • Collaborate regularly with BRMs and business stakeholders to define requirements, technology, architecture, and deliverables

  • Showcase ML, DL, and AI solutions, fostering trusted mentor-mentee relationships with business partners

Desired Skills

  • Big Data (Hadoop, Spark, Python, Hive)

  • Pipeline Tools (ETL, DBT, Kafka, Flume, Nifi, Streamsets, Beam, Camel, Pandas)

  • BI Tools (Power BI, Qlik, Tableau, Dash)

  • Fullstack (React, Flutter, Node, Java, Angular)

  • AI (Jupyter, Spark, H2O, Alteryx)

  • Data Science (Python/R, Stats, Machine Learning, Deep Learning)

  • Design for Deployment (Docker, Kubernetes, Airflow)

  • DevOps (JIRA, Confluence, Github, Jenkins)


Qualifications

  • 5+ years of data management and BI experience

  • Hands-on data warehousing, data lake, data marts, analytics experience

  • BS in Computer Science or Information Systems

  • 3+ years of experience with big data and BI

  • 1+ years of SDLC/PDLC responsibilities

  • Good communication skills; articulate concepts with clarity and confidence

  • Experience working with agile methodologies (specifically, the Scrum and SAFe frameworks)



HPE is an equal opportunity employer/Female/Minority/Individual with Disabilities/Protected Veteran Status