Associate Data Solutions Engineer

PA-West Chester - Dunwoody Dr, United States


The primary purpose of this role is to work as part of a multidisciplinary Agile team to create and maintain mission-critical data management, financial, and analytics systems. This involves providing strong foundational technical skills and engaging directly in backlog management, release planning, analysis, modeling, design, implementation, delivery, and support activities. The Associate Data Solutions Engineer contributes to solution delivery both collaboratively and strategically, engaging directly with stakeholders to understand, elaborate, and prioritize objectives within roadmap and blueprint frameworks and to deliver effective, risk-managed solution releases.

We are considering candidates to work onsite at our West Chester, PA office, or possibly in our Des Moines, IA location. We are currently working in a hybrid model - in the office three days/week (Tues, Wed, Thurs) with the option to work remotely the remaining two weekdays.

  • Align technical approaches with architectural blueprints and follow approved firm practices. 
  • Collaborate with business and technical partners to define new features and system capabilities. 
  • Participate in regular design reviews, code reviews, and practice reviews. 
  • Contribute to the documentation of team patterns, standards, and technical approaches. 
  • Implement appropriate SCM, CI, CD, automated testing, and other DevSecOps practices. 
  • Participate in ongoing DevSecOps activities. 
  • Liaise with internal product teams, functional teams, and SRE teams to coordinate work. 
  • Provide L2+ support for team products and systems. 
  • Define, design, implement, validate, and support solution elements alongside teammates. 
  • Assist the manager with ongoing operational and practice improvements. 

KNOWLEDGE, SKILLS, & ABILITIES

Technical: 

  • Bachelor’s degree in IT or related field preferred. 
  • 1+ years of professional experience designing, building, delivering, and supporting data-intensive distributed systems and/or data management systems. 
  • 1+ years of professional experience developing enterprise-class SOA/micro-service systems and/or enterprise-class data management systems with streaming data feeds. 
  • 1+ years of professional experience developing solutions with any three (or more) of the following AWS mechanisms (or non-AWS equivalents): S3, Lambda, EFS, SNS, SQS, EventBridge, Kinesis, Aurora, DynamoDB, RDS Proxy, Glue, Athena, Redshift/Spectrum. 
  • Proficiency in at least one of the following (and familiarity or better with the other) is required:  
    • Common enterprise data engineering patterns and their pertinence in various solution contexts (e.g., Relational vs Dimensional Data Marts, Data Warehousing, Schema-on-Read vs Schema-on-Write, HSM, Data Replication and Movement, micro-batch processing, etc.). 
    • Common enterprise distributed systems patterns and their pertinence in various solution contexts (e.g., SOA/micro-services, CQRS, event-stream processing, pub/sub messaging, message queues, data fabrics, etc.). 
  • Familiarity with AWS or Azure “Cloud Native” solutions concepts (aka Serverless Computing) is required, practical experience highly preferred. 
  • Familiarity with BI & data visualization tools (e.g., Power BI, Tableau, or Qlik) and the attendant underlying data structures is required; practical experience preferred. 
  • Practical experience with at least one large-scale distributed data store (e.g., Cassandra, Greenplum, PostgreSQL, Redshift) is preferred, with AWS Redshift the most relevant. 
  • Practical experience with at least one message-oriented middleware (MOM) platform is required (e.g., MQ, RV, JMS, MSMQ, Solace, ZeroMQ, Kafka, etc.). 
  • Practical experience with at least one messaging serialization mechanism is required (e.g., Protobufs, Thrift, Avro, JSON, etc.). 
  • Proficiency with two or more of the following languages is required: C#, Python, PowerShell, SQL, JavaScript. 
  • Proficiency with Git is required; experience with GitHub Enterprise preferred. 
  • Proficiency with ETL and ELT concepts, related tooling, and building home-grown solutions is required. 
  • Understanding of OOA/OOD, and applicability to system as well as element design, is required. 
  • Proficiency with conceptual, logical, and system modeling notations and tooling is required; UML is preferred. 
  • Practical experience with CI/CD tooling is strongly preferred (e.g., Jenkins, Bamboo, TeamCity, AWS CodePipeline, DB Maestro, etc.). 
  • Practical experience with automated BDD test tooling is strongly preferred (e.g., SpecFlow, Cucumber, etc.). 
  • Any experience with Big Data technologies and patterns is a plus. 
  • Experience with Kanban, Lean, or XP practices is a plus. 

Non-technical: 

  • Demonstrates superior communication skills: writes and speaks clearly and concisely; presents ideas understandably and persuasively; uses appropriate channels for audience and subject matter; able to consume and share information at multiple levels, from detailed technical directions to high-level business concepts. 
  • Exhibits efficient and effective documentation skills: recognizes and achieves “just enough” documentation, leveraging content-suitable mediums and mechanisms such as wikis, pictures, diagrams, schemas, models, etc. 
  • Exercises strong analytical capabilities with the ability to ask probing questions to gather complete information. 
  • Steps back from detail to see the big picture, distinguish the vital from the trivial, ignore nonessentials, and remove obstacles so that essential things have clear, smooth passage. 
  • Remains calm and composed in adversity, lending stability and order to stressful situations. 
  • Deals easily with multiple concurrent issues and tasks under pressure. 
  • Anticipates potential obstacles and develops contingency plans to overcome them. 
  • Possesses an Agile mindset and a drive to deliver business value via relentless incrementalism. 
  • Experience in insurance, financial services, or a related field is a plus. 


Venerable Values: 

Every position at Venerable has responsibility for living out the company's values as described here:

We are Courageous - We think critically, ask "why?" and seek out creative solutions.

We are Curious - We take calculated risks, learn from our failures, and challenge traditional ways of thinking.

We are Connected - We are connected to each other, our customers and our community.

