Step Functions

Data Engineer | AWS, Python & Snowflake | Ridgefield, CT (Hybrid) | $140K–$185K

🧠 Data Engineer

Before reading further, here are three non-negotiables that the client has made absolutely clear:

This is a Direct Hire (W2) permanent position. No C2C, no 1099, no third parties.

Candidates must NOT require sponsorship now or in the future. We can only consider US Citizens or Green Card holders who can work long-term without restriction.

The role is Hybrid – non-negotiable. You must be onsite 2–3 days per week in Ridgefield, CT, so candidates must already live within commuting distance or be willing to relocate.
(The good news: the company offers excellent benefits and is open to relocation support or a sign-on bonus for the right hire.)

๐Ÿ“ Location: Ridgefield, Connecticut (Hybrid โ€“ 2โ€“3 days onsite per week)
๐Ÿ’ผ Openings: 2
๐Ÿข Industry: Information Technology / Life Sciences
๐ŸŽ“ Education: Bachelorโ€™s degree in Computer Science, MIS, or related field (Masterโ€™s preferred)
๐Ÿšซ Visa Sponsorship: Not available
๐Ÿšš Relocation: Available for the ideal candidate
๐Ÿ’ฐ Compensation: $140,000 โ€“ $185,000 base salary + full benefits
๐Ÿ•“ Employment Type: Full-Time | Permanent

🌟 The Opportunity

Step into the future with a global leader in healthcare innovation, where Data and AI drive transformation and impact millions of lives.

As part of the Enterprise Data, AI & Platforms (EDP) team, you'll join a high-performing group that's building scalable, cloud-based data ecosystems and shaping the company's data-driven future.

This role is ideal for a hands-on Data Engineer who thrives on designing, optimizing, and maintaining robust data pipelines in the cloud, while collaborating closely with architects, scientists, and business stakeholders across the enterprise.

🧭 Key Responsibilities

  • Design, develop, and maintain scalable ETL/ELT data pipelines and integration frameworks to enable advanced analytics and AI use cases.

  • Collaborate with data architects, modelers, and data scientists to evolve the company's cloud-based data architecture strategy (data lakes, warehouses, streaming analytics).

  • Optimize and manage data storage solutions (e.g., S3, Snowflake, Redshift), ensuring data quality, integrity, and security.

  • Implement data validation, monitoring, and troubleshooting processes to ensure high system reliability.

  • Work cross-functionally with IT and business teams to understand data requirements and translate them into scalable solutions.

  • Document architecture, workflows, and best practices to support transparency and continuous improvement.

  • Stay current with emerging data engineering technologies, tools, and methodologies, contributing to innovation across the organization.

🧠 Core Requirements

Technical Skills

✅ Hands-on experience with AWS data services such as Glue, Lambda, Athena, Step Functions, and Lake Formation (a minimal orchestration sketch follows this list).
✅ Strong proficiency in Python and SQL for data manipulation and pipeline development.
✅ Experience in data warehousing and modeling (dimensional modeling, Kimball methodology).
✅ Familiarity with DevOps and CI/CD practices for data solutions.
✅ Experience integrating data between applications, data warehouses, and data lakes.
✅ Understanding of data governance, metadata management, and data quality principles.
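
To make the stack above concrete: here is a minimal, hypothetical Python sketch of how boto3 commonly ties Glue and Step Functions together in this kind of pipeline. The Glue job name and state machine ARN are invented placeholders, not details of the client's environment, and suitable IAM permissions are assumed.

```python
# Hypothetical sketch: kick off a Glue ETL job, then a Step Functions
# state machine that validates and publishes its output.
import json

import boto3

glue = boto3.client("glue")
sfn = boto3.client("stepfunctions")


def run_nightly_pipeline(run_date: str) -> str:
    """Start the Glue job, then the downstream state machine.

    Returns the Step Functions execution ARN so callers can poll status.
    """
    job = glue.start_job_run(
        JobName="nightly-sales-etl",  # invented job name
        Arguments={"--run_date": run_date},
    )
    execution = sfn.start_execution(
        # Invented ARN -- substitute a real state machine.
        stateMachineArn="arn:aws:states:us-east-1:123456789012:stateMachine:publish-sales",
        input=json.dumps({"glue_run_id": job["JobRunId"], "run_date": run_date}),
    )
    return execution["executionArn"]


if __name__ == "__main__":
    print(run_nightly_pipeline("2024-01-31"))
```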

Cloud & Platform Experience

  • Expertise in AWS, Azure, or Google Cloud Platform (GCP) – AWS preferred.

  • Knowledge of ETL/ELT tools such as Apache Airflow, dbt, Azure Data Factory, or AWS Glue.

  • Experience with Snowflake, PostgreSQL, MongoDB, or other modern database systems (a minimal Snowflake loading sketch follows this list).
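
For a flavor of the Snowflake side of the stack, a minimal sketch using the official snowflake-connector-python package. The connection values, stage, and table names are placeholders; a production pipeline would pull credentials from a secrets manager rather than hard-coding them.

```python
# Hypothetical sketch: load staged S3 files into Snowflake via COPY INTO.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder account identifier
    user="etl_user",        # placeholder credentials
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)

with conn.cursor() as cur:
    # COPY INTO reads files from a pre-defined external stage; the stage
    # "@sales_stage" and table "RAW.SALES" are invented for illustration.
    cur.execute("""
        COPY INTO RAW.SALES
        FROM @sales_stage/daily/
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
conn.close()
```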

Education & Experience

🎓 Bachelor's degree in Computer Science, MIS, or related field
💼 5–7 years of professional experience in data engineering or data platform development
⭐ AWS Solutions Architect certification is a plus

🚀 Preferred Skills & Attributes

  • Deep knowledge of big data technologies (Spark, Hadoop, Flink) is a strong plus.

  • Proven experience troubleshooting and optimizing complex data pipelines.

  • Strong problem-solving skills and analytical mindset.

  • Excellent communication skills for collaboration across technical and non-technical teams.

  • Passion for continuous learning and data innovation.

💰 Compensation & Benefits

💵 Base Salary: $140,000 – $185,000 (commensurate with experience)
🎯 Bonus: Role-based variable incentive
💎 Benefits Include:

  • Comprehensive health, dental, and vision coverage

  • Paid vacation and holidays

  • 401(k) retirement plan

  • Wellness and family support programs

  • Flexible hybrid work environment

🧩 Candidate Snapshot

  • Experience: 5–7 years in data engineering or a related field

  • Key Skills: AWS Glue | Python | SQL | ETL | CI/CD | Snowflake | Data Modeling | Cloud Architecture

  • Seniority Level: Mid–Senior

  • Work Arrangement: 2–3 days onsite in Ridgefield, CT

  • Travel: Occasional

🚀 Ready to power the future of data-driven healthcare?
Join a global data and AI team committed to harnessing the power of cloud and analytics to drive discovery, innovation, and meaningful impact worldwide.

AWS Bedrock Developer (GenAI Engineer) | $70/hr Contract | Dallas, TX | 6-Month Project

AWS Bedrock Developer

๐Ÿ“ Location: Dallas, TX (Onsite)
๐Ÿ“… Contract Length: 6 Months
๐Ÿ’ต Pay Rate: $70 per hour
๐Ÿ’ผ Employment Type: Contract | Mid-Senior Level
๐Ÿ›‚ Visa Sponsorship: Not available
๐Ÿšš Relocation Assistance: Not available
๐Ÿ‘ค Openings: 1

About the Role

We are seeking an experienced AWS Bedrock Developer / GenAI Engineer with a strong background in cloud computing and generative AI. This role will focus on transforming Contact Center applications using GenAI and leveraging AWS Bedrock to deliver scalable, innovative solutions.

You'll work hands-on with AWS services, Terraform, and automation frameworks, helping to design, integrate, and optimize cutting-edge GenAI solutions in a mission-critical environment.

Key Responsibilities

  • Design and implement GenAI solutions using AWS Bedrock (see the sketch after this list).

  • Lead the transformation of Contact Center applications with generative AI capabilities.

  • Automate server and infrastructure provisioning using Terraform.

  • Develop and optimize AWS Lambda functions, Step Functions, and SSO integrations.

  • Integrate on-premises systems with AWS Bedrock for seamless enterprise adoption.

  • Collaborate with cross-functional teams to deliver scalable and secure architectures.
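
For a sense of what "GenAI solutions using AWS Bedrock" looks like in code, here is a minimal, hypothetical sketch using boto3's bedrock-runtime client with the Anthropic Messages request format. The model ID, region, and the transcript-summarization use case are illustrative assumptions, not the client's actual design; in practice this call would sit behind the Lambda/Step Functions layers described above.

```python
# Hypothetical sketch: call a foundation model through Amazon Bedrock.
import json

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")


def summarize_transcript(transcript: str) -> str:
    """Ask a Bedrock-hosted model to summarize a contact-center transcript."""
    response = bedrock.invoke_model(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # example model ID
        body=json.dumps({
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 300,
            "messages": [{
                "role": "user",
                "content": f"Summarize this call transcript:\n\n{transcript}",
            }],
        }),
    )
    payload = json.loads(response["body"].read())
    return payload["content"][0]["text"]
```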

Must-Have Skills

  • 10+ years of IT experience.

  • 7+ years of AWS Cloud Computing expertise.

  • Proven hands-on experience with AWS Bedrock for GenAI solutions.

  • Strong experience with Terraform for infrastructure as code.

  • Expertise in AWS Lambda, Step Functions, and SSO integrations.

  • Experience integrating on-premises systems with AWS Bedrock.

Ideal Candidate Profile

  • Bachelor's degree in Computer Science, Engineering, or related field.

  • Strong technical communicator, able to work with technical and business stakeholders.

  • Ability to design and implement scalable, secure, and innovative AI-driven solutions.

Key Skills

AWS Bedrock | GenAI | AWS Cloud | Terraform | AWS Lambda | Step Functions | SSO Integrations | Contact Center Transformation | Cloud Automation

⚡ This role offers the chance to work at the forefront of Generative AI innovation in the enterprise space, shaping next-generation solutions for large-scale applications.