Data Engineer - REMOTE


Overview

Description

Systone is a digital services company that plays a pivotal role in supporting government missions and objectives through Agile Delivery Management, Human Centered Design and Software Engineering. We provide government and businesses with tools and services needed to serve citizens and customers.

Because we are passionate about empowering government and organizations with innovative technology solutions that drive efficiency and digital transformation, we have made it our mission to serve as a trusted liaison between business and technology.

Why Join Us?

Innovative Projects:

  • Work on impactful projects that improve the lives and health of citizens through technology.

Collaborative Environment:

  • Join a team that values open communication, collaboration, and a badgeless approach to working with clients and partners.

Professional Growth:

  • Take advantage of opportunities for professional development and career advancement.

Inclusive Culture:

  • Be part of a diverse and inclusive team that values different perspectives and experiences.

The Role

This role combines technical data engineering expertise with healthcare domain knowledge, requiring both deep technical skills in data pipeline architecture and understanding of Medicare claims processing workflows. You'll be responsible for data quality monitoring, audit trail implementation, and ensuring our systems meet the stringent compliance requirements of federal healthcare programs.

As a Data Engineer on the Professional Claims Adjudication platform, you'll collaborate with the team to architect and maintain mission-critical data pipelines that process Medicare professional claims data for CMS. You'll design robust ETL processes for claims ingestion, validation, and adjudication workflows while ensuring data integrity, security, and compliance with federal healthcare regulations. Working with a Spring Boot/Java backend and PostgreSQL database, you'll build scalable data solutions that handle high-volume claims processing with strict accuracy requirements.

You'll collaborate closely with software engineers, business analysts, and product teammates to ensure our data infrastructure supports complex adjudication business rules and integrates seamlessly with external systems like CWF (Common Working File) and MCS (Multi-Carrier System). Your work will directly impact Medicare beneficiaries by ensuring healthcare providers receive accurate and timely payments for medical services.

Why we want you:

You understand that healthcare data engineering requires both technical excellence and deep attention to detail, as your work directly impacts Medicare beneficiaries and healthcare providers. You can design scalable data architectures piece by piece while maintaining a holistic view of how claims flow through the entire adjudication system. You welcome regulatory changes and evolving business requirements as opportunities to improve the system rather than viewing them as "rework," and you never consider any data pipeline or process "final"; you understand that healthcare systems must continuously evolve.

You understand when "good enough" really IS good enough for non-critical processes, while never compromising on data accuracy, security, or compliance requirements for claims processing. You actively contribute to a learning team culture by collaborating openly, sharing knowledge about both technical solutions and healthcare domain expertise, and offering and receiving feedback with curiosity and respect. You're passionate about building systems that have real-world impact on healthcare delivery and patient outcomes.

Qualifications:

  • BS (or higher) in Computer Science, Data Engineering, or a related field

  • Strong experience with PostgreSQL database design, optimization, and administration (including read replicas and connection pooling)

  • Experience with Java 21+ and Spring Boot ecosystem for enterprise data processing applications

  • Proficiency in advanced SQL for complex data transformations, claims validation logic, and reporting

  • Experience with database migration tools (Flyway required for this role)

  • Knowledge of AWS cloud services (S3, RDS, CloudWatch, CloudTrail, Secrets Manager)

  • Experience with enterprise logging and monitoring solutions (Splunk experience strongly preferred)

  • Understanding of healthcare data standards (X12, HL7) and HIPAA compliance requirements

  • Experience with batch processing systems and file-based data ingestion patterns

  • Knowledge of data audit trails, change tracking, and regulatory compliance logging

  • Self-motivated with strong communication skills for cross-functional collaboration

  • Passion for learning new technologies and healthcare domain concepts quickly

  • Ability to write high-quality, testable code with comprehensive error handling

  • Experience with Agile application development in regulated environments

  • Test-Driven Development mindset for data pipeline reliability and mutation testing

  • Experience with data quality validation, reconciliation processes, and error handling patterns

Nice to have:

  • Experience with CMS DMOD (Data Management and Operations Division) systems and workflows

  • Knowledge of Medicare Professional Claims adjudication workflows and business rules

  • Experience with CWF (Common Working File) and MCS (Multi-Carrier System)

  • DevOps experience with containerization (Docker) and CI/CD pipelines using GitHub Actions

  • Experience with HikariCP connection pooling and advanced database performance tuning

  • Familiarity with Gradle build systems, dependency management, and multi-module projects

  • Experience with Hibernate/JPA for complex data persistence and entity relationship mapping

  • Knowledge of data audit trails and change tracking systems

  • Understanding of the AWS Advanced JDBC Wrapper and database failover strategies for high availability

  • Experience with JSON data processing, OpenAPI specifications, and REST API integration

  • Experience with mutation testing (PIT), code coverage analysis, and static code analysis (Sonar)

  • Background in financial or claims processing systems with regulatory compliance requirements

  • Experience with Logback, SLF4J, and structured logging patterns for enterprise applications

  • Knowledge of Connect:Direct file transfer protocols and batch processing orchestration

  • Experience with New Relic APM and application performance monitoring in production environments

Who you are:

  • Must be eligible to work in the US

  • Comfortable working remotely

  • Comfortable collaborating within a Scrum Team

  • Capable of deciding on next actions, and of knowing when a decision requires sign-off from management

  • Comfortable in a dynamic work environment

  • Embrace AI assistants (e.g., ChatGPT, NotebookLM) as force multipliers to accelerate innovation, iterate faster, and maintain high-quality output.

  • A good communicator with the ability to express and share ideas to business leaders with a non-technical background

  • Ability to work collaboratively within a group, actively network with others, and give the team timely feedback so that decisions stick

  • Ability to identify and communicate constraints, and to work within them

Why work with us?

  • Be part of an organization that wants to make an actual difference in society.

  • Opportunity to work from anywhere within the continental USA

  • Accrued PTO + 11 federal holidays + your birthday off

  • Training & certification budget

  • Health, Dental & Vision Insurance

  • Referral Bonus

  • 401k

Compensation

Our compensation model is competitive with market rates. And we ensure that everyone at Systone, regardless of race, ethnicity, gender, sexual orientation, disability, religion, age, nationality, or negotiation skills, receives equal pay for equal work.
