AWS Solution Architect
Jaipur, Rajasthan
Hakkoda
Hakkoda, an IBM Company, is a modern data consultancy that helps customers harness cloud platforms and AI capabilities for real-world results, and empowers data-driven organizations to realize the full value of the Snowflake Data Cloud. We provide consulting and managed services in data architecture, data engineering, analytics, and data science. We are renowned for bringing our clients deep expertise, being easy to work with, and being an amazing place to work! We are looking for curious and creative individuals who want to be part of a fast-paced, dynamic environment where everyone’s input and efforts are valued. We hire outstanding individuals and give them the opportunity to thrive in a collaborative atmosphere that values learning, growth, and hard work. Our team is distributed across North America, Latin America, India, and Europe. If you have the desire to be part of an exciting, challenging, and rapidly growing Snowflake consulting services company, and if you are passionate about making a difference in this world, we would love to talk to you!
As an AWS Managed Services Architect, you will play a pivotal role in architecting and optimizing the infrastructure and operations of a complex Data Lake environment for BOT clients. You will leverage your strong expertise with AWS services to design, implement, and maintain scalable and secure data solutions while driving best practices. You will work collaboratively with delivery teams across the U.S., Costa Rica, Portugal, and other regions, ensuring a robust and seamless Data Lake architecture. In addition, you will proactively engage with clients to support their evolving needs, oversee critical AWS infrastructure, and guide teams toward innovative and efficient solutions. This role demands a hands-on approach, including designing solutions, troubleshooting, optimizing performance, and maintaining operational excellence.
Role Description
- AWS Data Lake Architecture: Design, build, and support scalable, high-performance architectures for complex AWS Data Lake solutions.
- AWS Services Expertise: Deploy and manage cloud-native solutions using a wide range of AWS services, including but not limited to:
- Amazon EMR (Elastic MapReduce): Optimize and maintain EMR clusters for large-scale big data processing.
- AWS Batch: Design and implement efficient workflows for batch processing workloads.
- Amazon SageMaker: Enable data science teams with scalable infrastructure for model training and deployment.
- AWS Glue: Develop ETL/ELT pipelines using Glue to ensure efficient data ingestion and transformation.
- AWS Lambda: Build serverless functions to automate processes and handle event-driven workloads (a minimal illustrative sketch follows this list).
- IAM Policies: Define and enforce fine-grained access controls to secure cloud resources and maintain governance.
- AWS IoT & Timestream: Design scalable solutions for collecting, storing, and analyzing time-series data.
- Amazon DynamoDB: Build and optimize high-performance NoSQL database solutions.
- Data Governance & Security: Implement best practices to ensure data privacy, compliance, and governance across the data architecture.
- Performance Optimization: Monitor, analyze, and tune AWS resources for performance efficiency and cost optimization.
- Infrastructure as Code (IaC): Develop and manage IaC using AWS CloudFormation, Terraform, or equivalent tools to automate infrastructure deployment.
- Client Collaboration: Work closely with stakeholders to understand business objectives and ensure solutions align with client needs.
- Team Leadership & Mentorship: Provide technical guidance to delivery teams through design reviews, troubleshooting, and strategic planning.
- Continuous Innovation: Stay current with AWS service updates, industry trends, and emerging technologies to enhance solution delivery.
- Documentation & Knowledge Sharing: Create and maintain architecture diagrams, SOPs, and internal/external documentation to support ongoing operations and collaboration.
Qualifications
- 7+ years of hands-on experience in cloud architecture and infrastructure (preferably AWS).
- 3+ years of experience specifically in architecting and managing Data Lake or big data solutions on AWS.
- Bachelor’s Degree in Computer Science, Information Systems, or a related field (preferred).
- AWS Certifications such as Solutions Architect Professional or Big Data Specialty.
- Experience with Snowflake, Matillion, or Fivetran in hybrid cloud environments.
- Familiarity with Azure or GCP cloud platforms.
- Understanding of machine learning pipelines and workflows.
- Technical Skills:
- Expertise in AWS services such as EMR, Batch, SageMaker, Glue, Lambda, IAM, IoT, Timestream, DynamoDB, and more.
- Strong programming skills in Python for scripting and automation.
- Proficiency in SQL and performance tuning for data pipelines and queries.
- Experience with IaC tools like Terraform or CloudFormation.
- Knowledge of big data frameworks such as Apache Spark, Hadoop, or similar.
- Data Governance & Security:
- Proven ability to design and implement secure solutions, with strong knowledge of IAM policies and compliance standards.
- Problem-Solving: Analytical mindset to resolve complex technical challenges.
- Collaboration: Exceptional communication skills to engage with technical and non-technical stakeholders.
- Ability to lead cross-functional teams and provide mentorship.
Benefits
- Health Insurance
- Paid leave
- Technical training and certifications
- Robust learning and development opportunities
- Incentive
- Toastmasters
- Food Program
- Fitness Program
- Referral Bonus Program
Hakkoda is committed to fostering diversity, equity, and inclusion within our teams. A diverse workforce enhances our ability to serve clients and enriches our culture. We encourage candidates of all races, genders, sexual orientations, abilities, and experiences to apply, creating a workplace where everyone can succeed and thrive.
Ready to take your career to the next level? 🚀 Apply today and join a team that’s shaping the future!
Hakkoda has been acquired by IBM and will be integrated into the IBM organization; Hakkoda will be the hiring entity. By proceeding with this application, you understand that Hakkoda will share your personal information with other IBM subsidiaries involved in your recruitment process, wherever they are located. More information on how IBM protects your personal information, including the safeguards in case of cross-border data transfer, is available here.