AWS Data Engineer / Architect

722 East Market Street, Leesburg, VA, USA


We are seeking a Sr. AWS Data Engineer / Architect with hands-on experience in designing, implementing, and managing data architectures in cloud environments, particularly AWS and Databricks. The successful candidate will play a key role in architecting complex data solutions using Terraform, AWS services, and Databricks, while mentoring a team of engineers and collaborating with cross-functional teams to deliver high-quality data solutions.

This role requires extensive knowledge of cloud architecture and data engineering, along with experience building large-scale, reliable data pipelines in cloud environments. You will work closely with stakeholders to align technical solutions with business goals and to ensure adherence to best practices for data management, storage, and analytics.

Key Responsibilities:

  • Solution Design: Design and implement scalable, reliable, and high-performance data solutions on AWS and Databricks, ensuring they align with business objectives and technical requirements.
  • Architecture Diagrams: Create detailed architecture diagrams and blueprints for data storage, networking, and cloud-based application components.
  • Data Pipeline Development: Develop end-to-end data pipelines using AWS services such as Glue, Redshift, Lambda, S3, and Databricks. Design and implement solutions for data warehousing, ETL processes, and real-time data integration.
  • Team Leadership: Lead and mentor a team of data engineers, analysts, and architects. Provide guidance on best practices and architectural decisions, while driving the effective use of Databricks and AWS services.
  • Collaboration: Work closely with data scientists, business analysts, and development teams to ensure seamless integration of data solutions into the organization's broader architecture.
  • Innovation: Lead innovation efforts by evaluating, recommending, and implementing data-centric technologies for the platform. Continuously explore new technologies, benchmarking their capabilities for potential use.
  • Documentation: Create and maintain detailed design documents, implementation plans, and architectural diagrams. Ensure that all processes and solutions are thoroughly documented for future reference and compliance.

Required Skills and Experience:

Experience:

  • 10+ years of experience in designing, architecting, and implementing analytics solutions and cloud infrastructure.
  • 6+ years of experience working on AWS environments with hands-on experience in Terraform, Glue, Lambda, S3, and solving practical issues related to their usage.
  • Strong Databricks knowledge and practical experience in architecting solutions using Databricks.
  • Extensive hands-on experience in Python/PySpark for data engineering and developing scalable applications.
  • Experience with AWS IAM, SNS, RDS, and serverless Lambda setup.
  • Experience in Data Warehousing (Redshift, Snowflake) and strong SQL expertise.
  • Proficiency in Git, including version control and code branching strategies.
  • Experience with DevOps and CI/CD tools such as Jenkins and CloudBees.
  • Healthcare knowledge or experience with the Centers for Medicare & Medicaid Services (CMS) is preferred.

Technical Skills:

  • Terraform Expertise: Extensive hands-on experience with Terraform, managing version changes, and troubleshooting in complex environments.
  • Databricks Architecture: Strong architectural knowledge of Databricks, including implementing and optimizing data pipelines.
  • AWS Services: Proven experience with AWS services (EC2, S3, Lambda, RDS, Glue, etc.) in enterprise multi-account environments.
  • Data Engineering: Hands-on experience with data engineering tools and languages like Python/PySpark, Pandas, and AWS Glue.
  • Access Pattern Experience: Deep knowledge of AWS IAM for access management and practical experience with SNS and RDS.

Education:

  • Bachelor's Degree (BS/BE/BTech/MCA) or equivalent technical experience in Computer Science, Data Engineering, or a related field.

Preferred Qualifications:

  • AWS Certifications: AWS Certified DevOps Engineer or AWS Certified SysOps Administrator.
  • Experience with AWS FedRAMP-authorized services and compliance with federal cloud regulations.
  • Hands-on experience with Snowflake or similar data warehouse platforms.
  • Experience with Jira or other Agile project management tools.

How You'll Make an Impact:

  • Lead the development of cloud-based data solutions that scale to meet the demands of large, enterprise data environments.
  • Provide technical leadership and mentorship to the data engineering team, promoting best practices for cloud data architectures and efficient data pipelines.
  • Collaborate with stakeholders to ensure alignment between technical solutions and business goals.
  • Innovate by adopting and implementing new data technologies, improving data processing efficiency, and providing valuable insights to stakeholders.

Why IT Hub Inc.?

At IT Hub Inc., you will join a forward-thinking team focused on building innovative data solutions. We value continuous learning, collaboration, and technology-driven problem-solving. We offer a dynamic work environment where your expertise will directly contribute to the success of mission-critical projects.
