Data Engineer
Pune, MH, India
Arista Networks
Company Description
Arista Networks was founded to pioneer and deliver software-driven cloud networking solutions for large data center storage and computing environments. Arista’s award-winning platforms range in Ethernet speeds from 10 to 100 gigabits per second.
Arista is excited to scale the Wi-Fi Team in the Pune Development Center to take its Cognitive Wi-Fi solution to the next level. Arista has ambitious plans to grow the Pune-based Development Center over the next couple of years, and now is a great time to join: you can have a significant impact on the team’s shape and direction as the office grows.
Job Description
Role & Responsibilities
As a Data Engineer, you will be a member of the Wi-Fi Data team, which is a part of the broader Software Engineering team. With increasing amounts of data being ingested into the cloud, the Wi-Fi Data team will play a crucial role in the success of Arista’s Cognitive Wi-Fi solution. Because this team is small and relatively new, there is a lot of room for growth and making an impact.
As part of the Wi-Fi Data team, you will work closely with Data Scientists to build and maintain data and AI/ML pipelines that operate at scale, covering use cases such as anomaly detection, root cause analysis, automatic remediation, and analytics. You will also develop ELT data pipelines that extract data from multiple Wi-Fi data sources and ingest it into a data warehouse; in most cases you will work directly on the product backend to extract the data for further processing. Beyond these core responsibilities, you will develop and own the CI/CD pipelines that deploy your data pipelines. Depending on the project, you may have the chance to share your work with the larger community through talks and blog posts.
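To illustrate the kind of ELT workflow described above, here is a minimal, self-contained Python sketch: raw records are extracted from a source and loaded unmodified, then transformed inside the "warehouse". All names (`extract`, `load`, `transform`, the record fields) are hypothetical illustrations, not Arista APIs; a real pipeline would use a framework such as Apache Beam and an actual warehouse.

```python
# Toy ELT sketch. All function and field names are illustrative
# assumptions, not part of any Arista product or API.

def extract(source):
    """Extract: pull raw client-session records from a Wi-Fi data source."""
    return list(source)

def load(warehouse, table, records):
    """Load: land the raw records unmodified (the 'EL' in ELT)."""
    warehouse.setdefault(table, []).extend(records)

def transform(warehouse, table):
    """Transform: derive per-AP client counts from the raw table (the 'T')."""
    counts = {}
    for rec in warehouse[table]:
        counts[rec["ap"]] = counts.get(rec["ap"], 0) + 1
    return counts

# Hypothetical raw telemetry: one record per client session.
raw_source = [
    {"ap": "ap-1", "client": "aa:bb"},
    {"ap": "ap-1", "client": "cc:dd"},
    {"ap": "ap-2", "client": "ee:ff"},
]

warehouse = {}
load(warehouse, "sessions_raw", extract(raw_source))
per_ap = transform(warehouse, "sessions_raw")
```

The key design point of ELT (versus ETL) is that the raw data lands in the warehouse first and is transformed afterwards, so transformations can be re-run or revised without re-extracting from the source.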
Qualifications
Requirements
Bachelor’s degree in Computer Science or a related field.
Proficiency in Python or Go.
Experience working with databases (Relational and/or NoSQL).
Experience with data processing libraries, such as Apache Beam.
Experience in developing data pipelines and ELT/ETL workflows.
Hands-on experience with DevOps tools such as Jenkins, Git, Docker, Kubernetes, Ansible, and CI/CD pipelines.
Knowledge of data manipulation libraries, such as Pandas (Python), would be a plus.
Additional Information
All your information will be kept confidential according to EEO guidelines.