Data Engineer

Bengaluru, KA, India

Redica Systems


Company Description

Redica Systems is a SaaS start-up serving more than 200 customers within the life science sector, with a specific focus on Pharmaceuticals and MedTech. Our workforce is distributed globally, with headquarters in Pleasanton, CA.

Redica's data analytics platform empowers companies to improve product quality and navigate evolving regulations. Using proprietary processes, we harness one of the industry's most comprehensive datasets, sourced from hundreds of health agencies and the Freedom of Information Act.

Our customers use Redica Systems to more effectively and efficiently manage their inspection preparation, monitor their supplier quality, and perform regulatory surveillance.

More information is available at redica.com.

Job Description

We’re looking for an experienced Data Engineer to join our team as we continue to develop the first-of-its-kind quality and regulatory intelligence (QRI) platform for the life science industry.

Core Responsibilities 

  • Own code quality from the start: write maintainable, well-structured code that others can trust and build on
  • Build deep expertise in one or more areas such as data engineering, data modeling, data testing, and data pipeline orchestration, and rarely make the same technical mistake twice
  • Make steady, visible progress on tasks and proactively seek support when blocked
  • Balance automated and manual testing to ship with confidence
  • Prioritize tasks and focus on important details
  • Demonstrate deep expertise in the company's technology stack and the programming languages relevant to your team
  • Maintain a high-level understanding of how your team's technology stack interacts with other teams' systems and tools
  • Provide on-call support for your team when needed

About you

  • Tech Savvy: Actively adopt and incorporate new tools and frameworks into projects
  • Manage Complexity: Handle complex problems proficiently and integrate varied systems seamlessly
  • Plan and Align: Create and follow project plans, aligning tasks with organizational goals
  • Collaborate: Work effectively with team members and contribute to shared objectives
  • Manage Ambiguity: Navigate moderate ambiguity and adapt to changing requirements
  • Engaged: You share our values and possess the essential competencies needed to thrive at Redica, as outlined here: https://redica.com/about-us/careers/

Qualifications

  • 2-4 years of developer experience with an emphasis on code/system architecture and quality output
  • Experience designing and building data pipelines, data APIs, and ETL/ELT processes
  • Exposure to data modeling and data warehouse concepts
  • Hands-on experience in Python
  • Hands-on experience with Snowflake and Airflow is a must-have (see the sketch after this list)
  • Hands-on experience setting up, configuring, and maintaining SQL databases (MySQL/MariaDB, PostgreSQL)
  • Computer Science, Computer Engineering, or a similar technical degree
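
As a rough illustration of the kind of pipeline work these requirements point to, the following minimal sketch shows an Airflow DAG loading records into Snowflake. It assumes Airflow 2.x with the TaskFlow API and the snowflake-connector-python package; the DAG name, table, and connection values are hypothetical placeholders rather than details of this role.

    # Hypothetical ELT sketch: a daily Airflow DAG that extracts records and
    # loads them into a Snowflake table. All names and credentials are placeholders.
    from datetime import datetime

    import snowflake.connector
    from airflow.decorators import dag, task

    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def inspections_elt():
        @task
        def extract() -> list[dict]:
            # Placeholder extract step; in practice this might pull from an API or S3.
            return [{"agency": "FDA", "inspections": 12}]

        @task
        def load(rows: list[dict]) -> None:
            # Credentials would normally come from an Airflow connection or a secrets backend.
            conn = snowflake.connector.connect(
                account="my_account", user="my_user", password="***",
                warehouse="ANALYTICS_WH", database="RAW", schema="PUBLIC",
            )
            try:
                cur = conn.cursor()
                cur.executemany(
                    "INSERT INTO agency_inspections (agency, inspections) "
                    "VALUES (%(agency)s, %(inspections)s)",
                    rows,
                )
            finally:
                conn.close()

        load(extract())

    inspections_elt()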

Bonus Points

  • Experience with dbt is a major plus
  • Experience with the AWS data engineering stack (S3, Lake Formation, Lambda, Fargate, Kinesis Data Streams/Data Firehose) is a major plus
  • Experience with both batch and event-driven data architectures
  • Hands-on experience with NoSQL databases like DynamoDB and MongoDB 
  • Exposure to a start-up engineering environment is a plus

Additional Information

Top pharmaceutical companies, food manufacturers, medtech companies, and service firms from around the globe rely on Redica Systems to mine and process government inspection, enforcement, and registration data. This enables them to quantify risk signals from their suppliers, identify market opportunities, benchmark against peers, and prepare for the latest inspection trends. 

Our data and analytics have been cited by major media outlets including MSNBC, The Wall Street Journal (WSJ), and The Boston Globe.


