Data Engineer
Austin, Texas, United States - Remote
PILYTIX brings Explainable Artificial Intelligence (XAI) to sales and fundraising teams with software-as-a-service products that enable them to be more effective throughout their entire sales funnel. Data Engineers will assist in the development of PILYTIX's cutting-edge zeroG™ platform by maintaining and enhancing the data pipelines that ingest, transform, and archive client data. They will design connections to new sources of data, optimize existing integrations, and develop scalable workflows as data management specialists. They will also build internal tools to support our team of Data Scientists and Software Engineers, powering the PILYTIX products to help our clients win more revenue, faster.
Responsibilities:
- Design workflows using SQL, internal and external APIs, and other tools for data wrangling and customized, client-focused operations
- Develop and adapt code to facilitate Master Data Management (MDM)
- Author and monitor directed acyclic graphs (DAGs) in Apache Airflow to ingest and transform data
- Build connections for new sources of data including from sales, marketing, sports and entertainment event ticketing, and social media platforms
- Maintain and improve internal Python packages to streamline ingest processes
- Work collaboratively with the app development team to add new data-driven features to our software-as-a-service product
- Work collaboratively with the data science team to enhance our AI and machine learning capabilities
- Participate in Agile / Kanban processes on a daily basis
- Execute tasks with adherence to high standards of data governance and stewardship
- Comply with change management policies and code reviews to ensure data integrity and system stability
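To illustrate the kind of ingest-and-transform work described above, here is a minimal, hypothetical sketch of an extract-transform-load step in Python. It uses only the standard library (`json` and `sqlite3` standing in for a real source feed and warehouse), and every payload, table, and field name is invented for the example, not PILYTIX's actual schema:

```python
import json
import sqlite3

# Illustrative raw payload, as it might arrive from an external API.
RAW_EVENTS = json.dumps([
    {"id": 1, "email": "A@Example.com", "amount": "125.50"},
    {"id": 2, "email": "b@example.com", "amount": "80.00"},
])

def transform(record):
    # Normalize fields before loading: lower-case emails, numeric amounts.
    return record["id"], record["email"].lower(), float(record["amount"])

def load(rows, conn):
    # Idempotent load: re-running the task replaces rather than duplicates rows.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS events "
        "(id INTEGER PRIMARY KEY, email TEXT, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO events VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load([transform(r) for r in json.loads(RAW_EVENTS)], conn)
total = conn.execute("SELECT SUM(amount) FROM events").fetchone()[0]
```

In a production pipeline, steps like `transform` and `load` would typically run as tasks in an Airflow DAG against a warehouse such as PostgreSQL or BigQuery rather than an in-memory database.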
Requirements:
- BS or higher in a STEM field and 2-4 years of hands-on industry experience programming and working directly with moderately sized datasets as an analyst, engineer, consultant, database administrator, or in a similar role
- Exceptional understanding of data architecture and software engineering best practices including fundamental knowledge of object-oriented design and data structures
- 2+ years professional experience with SQL (PostgreSQL & BigQuery preferred)
- 2+ years professional experience with Python
- Proficiency using REST APIs and writing code to optimize data queries, implement ETL/ELT processes, and perform data wrangling
- Familiarity with Apache Airflow or similar data pipeline systems
- Experience working with Salesforce Sales Cloud or similar CRM systems
- Experience working with Salesforce Marketing Cloud or similar marketing and customer engagement platforms
- Proficiency with Git or other DVCS, including in a team environment
- Knowledge of Agile / Kanban processes
- Entrepreneurial spirit and highly self-motivated
Nice to Have:
- Familiarity with Google Cloud Platform (GCP) infrastructure and microservices
- Exposure to CI/CD and containerized deployment via Docker / Kubernetes
The job is based in Austin, TX, but exceptionally qualified remote candidates who are willing to travel to Austin semi-regularly may apply.
Benefits:
- Competitive base salary with ability to earn bonuses
- Professional development and entrepreneurial opportunities
- Paid time off
- 401(k)
- Medical and dental plans