Senior Data Infrastructure Engineer
Dublin, Ireland
Intercom
Intercom is the complete AI-first customer service solution with ticketing, inbox, phone, help center, and the most advanced AI agent on the market that works with any platform.

Intercom was founded in 2011 to change the standard of customer service online. Our AI-first customer service platform is a totally new way to deliver customer service, designed to transform the way businesses interact with their customers through AI. We all know that customer service on the internet sucks. It's slow and impersonal. We help businesses provide instant and exceptional service to their customers and maximize their support agents' productivity, efficiency, and performance, all through our single AI system. More than 25,000 businesses use Intercom to send millions of messages to millions of customers each month.

Intercom has been a long-standing product leader and cultural icon in the technology and startup worlds for more than a decade. We set the pace for our industry and live by our values that allow us to push boundaries, build with speed and intensity, and deliver incredible value to our customers.

Join us on our mission to redefine customer service and make internet business personal.
What's the opportunity?
The Data Infrastructure team builds distributed systems and tools that support Intercom and empower people with information. As the company grows, so do the volume and velocity of our data, and the appetite for increasingly sophisticated and specialized, often AI-assisted, data solutions.
Our team builds, maintains, evolves, and extends the data platform, enabling our partners to self-serve by creating their own end-to-end data workflows: from ingesting and transforming data, through evaluating experiments, to analyzing usage and running predictive models. We provide a solid data foundation to support a variety of highly impactful business and product-focused projects.
We’re looking for a Senior Data Infrastructure Engineer who is passionate about building solid foundations for delivering high-quality data to our consumers, and who will collaborate with us on large-scale data infrastructure initiatives.
What will I be doing?
- Evolve the Data Platform by designing and building the next generation of the stack.
- Develop, run, and support our batch and real-time data pipelines using tools like Airflow, PlanetScale, Kinesis, Snowflake, and Tableau, all in AWS (a minimal sketch of what such a pipeline can look like follows this list).
- Collaborate with product managers, data engineers, analysts and data scientists to develop tooling and infrastructure to support their needs.
- Develop automation and tooling to support the creation and discovery of high quality analytics data in an environment where dozens of changes can be shipped daily.
- Implement systems to monitor our infrastructure and to detect and surface data quality issues.
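For illustration only, here is the minimal pipeline sketch referenced above, written against Airflow's TaskFlow API (assuming Airflow 2.4+). Every DAG, task, and value here is hypothetical; this posting does not describe Intercom's actual pipelines, and a real task would read from and write to real systems (e.g. MySQL, Kinesis, Snowflake).

```python
# Hypothetical sketch of a daily batch pipeline with a data quality gate.
# Assumes Airflow 2.4+ (TaskFlow API, `schedule` parameter); all names are invented.
from datetime import datetime

from airflow import DAG
from airflow.decorators import task

with DAG(
    dag_id="example_events_daily",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:

    @task
    def extract():
        # A real task might pull from MySQL or a Kinesis-fed landing zone.
        return [{"event": "signup", "count": 42}]

    @task
    def transform(rows):
        # Normalise records before loading.
        return [{**r, "count": int(r["count"])} for r in rows]

    @task
    def quality_check(rows):
        # Fail the run (and trigger alerting) rather than load bad data.
        if not rows or any(r["count"] < 0 for r in rows):
            raise ValueError("data quality check failed")
        return rows

    @task
    def load(rows):
        # A real task would write to the warehouse (e.g. Snowflake).
        print(f"loading {len(rows)} rows")

    load(quality_check(transform(extract())))
```

Chaining the tasks through return values lets Airflow infer the dependency graph, so the quality gate naturally sits between transformation and loading.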
Recent projects the team has delivered:
- Refactoring our MySQL ingestion pipeline for reduced latency and 10x scalability.
- Redshift-to-Snowflake migration.
- Unified Local Analytics Development Environment for Airflow and dbt.
- Building our next-generation company metrics framework, adding anomaly detection and alerting, and enabling easier discovery and consumption (a simple illustration of the anomaly-detection idea follows).
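As a flavour of what anomaly detection and alerting can mean in a metrics framework, here is a deliberately simple, hypothetical z-score check; the posting does not say which method the team actually uses, so treat this purely as an illustration.

```python
# Hypothetical z-score anomaly check over a daily metric series.
# The real metrics framework mentioned above is not described in this posting.
import statistics

def is_anomalous(history, latest, threshold=3.0):
    """Flag `latest` if it sits more than `threshold` standard
    deviations from the mean of `history`."""
    if len(history) < 2:
        return False  # not enough data to estimate spread
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return latest != mean
    return abs(latest - mean) / stdev > threshold

# Example: a sudden drop in a daily metric would be surfaced for alerting.
daily_counts = [980.0, 1010.0, 995.0, 1002.0, 990.0]
print(is_anomalous(daily_counts, 400.0))  # True
```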
About you
- You have 5+ years of full-time, professional work experience in the data space using Python and SQL.
- You have solid experience building and running data pipelines for large and complex datasets including handling dependencies.
- You have hands-on cloud provider experience (preferably AWS) including service integrations and automation via CLI and APIs.
- You have a solid understanding of data security practices and are passionate about privacy.
- You can demonstrate the significant impact that your work has had, both on the technology side as well as with the teams you’ve been part of.
- You have a great sense of what should be worked on next and know how to break big ambiguous problems into small workable chunks.
- You love helping people grow and recognise where your mentorship might be more valuable than your direct technical contributions on a project.
- You care about your craft.
In addition, it would be a bonus if you have:
- Worked with Apache Airflow - we use Airflow extensively to orchestrate and schedule all of our data workflows. A good understanding of the quirks of operating Airflow at scale would be helpful.
- Experience with, or an understanding of, tools and technologies in the modern data stack (Snowflake, dbt); a purely illustrative sketch of pairing Airflow with dbt follows this list.
- Industry awareness of up-and-coming technologies and vendors.
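For the curious, one common way of pairing the two tools above is to have Airflow schedule dbt; the sketch below uses a plain BashOperator, and the dag_id, project path, and model selector are all invented for illustration.

```python
# Hypothetical: scheduling a dbt build from Airflow with a BashOperator.
# The dag_id, project path, and `--select` target are invented.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_dbt_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt --select marts",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt --select marts",
    )
    dbt_run >> dbt_test  # only test what has just been built
```

Running `dbt test` as a separate downstream task keeps build failures and data test failures distinct in the Airflow UI.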
Benefits
We are a well-treated bunch, with awesome benefits! If there’s something important to you that’s not on this list, talk to us!
- Competitive salary and equity in a fast-growing start-up
- We serve lunch every weekday, plus a variety of snack foods and a fully stocked kitchen
- Regular compensation reviews - we reward great work!
- Pension scheme & match up to 4%
- Peace of mind with life assurance, as well as comprehensive health and dental insurance for you and your dependents
- Open vacation policy and flexible holidays so you can take time off when you need it
- Paid maternity leave, as well as 6 weeks paternity leave for fathers, to let you spend valuable time with your loved ones
- If you’re cycling, we’ve got you covered with the Cycle-to-Work Scheme, plus secure bike storage.
- MacBooks are our standard, but we also offer Windows for certain roles when needed.
Policies
Intercom has a hybrid working policy. We believe that working in person helps us stay connected, collaborate more easily, and create a great culture, while still providing the flexibility to work from home. We expect employees to be in the office at least two days per week.
We have a radically open and accepting culture at Intercom. We avoid spending time on divisive subjects to foster a safe and cohesive work environment for everyone. As an organization, our policy is not to advocate on behalf of the company or our employees on any social or political topics in our internal or external communications. We respect personal opinion and expression on these topics on personal social platforms on personal time, and do not challenge or confront anyone for their views on non-work-related topics. Our goal is to focus on doing incredible work to achieve our goals and unite the company through our core values.
Intercom values diversity and is committed to a policy of Equal Employment Opportunity. Intercom will not discriminate against an applicant or employee on the basis of race, color, religion, creed, national origin, ancestry, sex, gender, age, physical or mental disability, veteran or military status, genetic information, sexual orientation, gender identity, gender expression, marital status, or any other legally recognized protected basis under federal, state, or local law.
Is this role not quite what you're looking for? Join our Talent Community to stay connected with us.