Lead Engineer, Big Data

AZ, United States


JOB SUMMARY

Responsible for all aspects of the architecture, design, and implementation of Data Management solutions using a Big Data platform on Cloudera or Hortonworks, as well as other areas of enterprise application platforms.

KNOWLEDGE, SKILLS & ABILITIES (Generally, the occupational knowledge and specific technical and professional skills and abilities required to perform the essential duties of this job):

  • Provide leadership in choosing the ideal architecture, evaluating tools and frameworks, and defining standards and best practices for implementing scalable business solutions
  • Understand, articulate, interpret, and apply the principles of the defined data and analytics strategy to unique, complex business problems
  • Mentor development teams to build tools for data quality control and repeatable data tasks that will accelerate and automate data management duties.
  • Implement Batch and Real-time data ingestion/extraction processes through ETL, Streaming, API, etc., between diverse source and target systems with structured and unstructured datasets
  • Design and build data solutions with an emphasis on performance, scalability, and high reliability
  • Design analytical data models for self-service BI
  • Contribute to leading and building a team of top-performing data technology professionals
  • Help with project planning and execution
  • Analyze current business practices, processes and procedures and identify opportunities for leveraging Microsoft Azure data & analytics PaaS services.
  • Expert-level experience with Azure Big Data services (e.g., Azure Data Factory, Azure DevOps, Azure Storage/Data Lake, Azure Databricks)
  • Expert-level experience with Hadoop cluster components and services (e.g., HDFS, YARN, ZooKeeper, Ambari/Cloudera Manager, Sentry/Ranger, Kerberos)
  • Design and implement BI solutions to meet business requirements using modern BI tools (e.g., Power BI, Tableau)
  • Ability to lead in resolving technical issues while engaging with infrastructure and vendor support teams

JOB FUNCTION:

Responsible for all aspects of the architecture, design, and implementation of Data Management solutions using a Big Data platform on Cloudera or Hortonworks, as well as other areas of enterprise application platforms.

REQUIRED EDUCATION:

Bachelor's Degree

REQUIRED EXPERIENCE:

  • 8+ years of data management experience
  • Previous experience leading projects or teams
  • Experience building stream-processing systems using solutions such as Kafka, Storm, or Spark Streaming
  • Proven experience with Big Data tools such as Spark, Hive, Impala, PolyBase, Phoenix, Presto, and Kylin
  • Experience integrating data from multiple data sources (using ETL tools such as Talend)
  • Experience manipulating large datasets with Big Data processing tools
  • Strong experience with Data Lakes, Data Warehouses, Data Validation & Certification, Data Quality, Metadata Management, and Data Governance
  • Experience with programming languages such as PySpark, Scala, and SQL
  • Experience implementing web applications and web service APIs (REST/SOAP)

PREFERRED EDUCATION:

Master's Degree

PREFERRED EXPERIENCE:

Experience in the healthcare industry

To all current Molina employees: If you are interested in applying for this position, please apply through the intranet job listing.

Molina Healthcare offers a competitive benefits and compensation package. Molina Healthcare is an Equal Opportunity Employer (EOE) M/F/D/V.

