Sr. Data Engineer

Remote Full-time
The Senior Data Engineer is a technical leader responsible for developing and optimizing the full data infrastructure of the GIA Enterprise Data Warehouse (EDW). This role ensures the scalability, reliability, and security of data pipelines that support critical operational and financial reporting across the organization.

Key responsibilities include:
• Designing and leading the development of scalable data pipelines and ETL processes using Snowflake, Matillion, and/or other integration tools.
• Optimizing and troubleshooting complex data workflows to ensure high performance, data integrity, and reliability.
• Architecting data ingestion from diverse external sources and ensuring efficient movement from source to target systems.
• Managing third-party integrations and overseeing the ingestion of data from new practice acquisitions into the EDW.
• Implementing and enforcing data governance, privacy, and security best practices to protect sensitive healthcare data.
• Providing technical leadership through code reviews, testing, and documentation to uphold high standards of data engineering.

Education:
Bachelor's degree in computer science, a related IT systems program of study, or an alternate program with equivalent IT-related experience.

Technical Experience:
• 8+ years of progressive experience in data engineering, with a strong focus on healthcare data systems, compliance, and interoperability.
• Expert-level proficiency in Snowflake and Matillion (or similar tools) for designing, developing, and optimizing complex data integration and transformation workflows.
• Advanced SQL skills with demonstrated experience in performance tuning, data modeling (star/snowflake schemas), and relational database design.
• Hands-on experience with cloud platforms (AWS, Azure, GCP), including cloud-native data services, cost optimization, and infrastructure-as-code practices.
• 5+ years of Python development, including deep familiarity with libraries such as NumPy, Pandas, and the Snowflake Connector for Python. Skilled in building scalable data transformation pipelines using Jupyter Notebooks and the Azure SDK.
• Extensive experience with Git-based source control systems (GitHub, Azure DevOps, GitLab), including branching strategies, pull request reviews, and CI/CD integration.
• Proficient in developing data integrations from disparate sources using VS Code and GitHub, with a focus on modular, reusable code and robust error handling.
• Expert at translating source-to-target mapping (STM) documentation into robust data pipelines.
• Strong knowledge of healthcare data governance, privacy, and compliance standards (HIPAA, HITECH), with experience implementing secure data access controls.
• Exceptional problem-solving skills, with a track record of resolving complex data issues and optimizing performance across large-scale systems.
• Strong communication and collaboration abilities, with experience working cross-functionally with data architects, analysts, and business stakeholders.

Industry-Specific Experience:
• Led the development of complex financial data integrations and schema mappings from legacy systems to modern reporting platforms. Demonstrated advanced understanding of period-based accounting data, including charge reconciliation, refunds, payments, and voids, ensuring accurate financial balances across systems.
• Architected and implemented medical data integrations from disparate sources into standardized reporting schemas. Applied deep domain expertise in healthcare data, including periods of care, CPT/HCPCS codes, encounter types, service dates, insurance hierarchies (primary/secondary), payment structures, and provider/location attribution.
• Designed scalable frameworks for onboarding new data sources, including acquisitions, with automated validation and transformation logic to maintain data integrity and consistency.
Preferred Qualifications:
• 5+ years of hands-on experience with the Snowflake cloud data platform, including performance optimization, advanced SQL features, and data sharing capabilities.
• 7+ years of experience using VS Code integrated with Git-based source control systems (e.g., GitHub, Azure DevOps), including CI/CD workflows, branching strategies, and code review best practices.
• 7+ years of experience developing scalable data pipelines using Python and Pandas, with expertise in loading, transforming, and validating financial and medical datasets in cloud-based relational databases.
• Extensive experience working in Agile environments, contributing to sprint planning, backlog grooming, and cross-functional collaboration with product owners, data architects, and analysts.
• Demonstrated ability to lead technical initiatives, mentor junior engineers, and contribute to architectural decisions and data strategy.

Equipment Operated:
This role routinely uses standard office equipment such as computers, phones, photocopiers, filing cabinets, and fax machines.

Work Environment:
This job operates in professional office environments.

Physical Requirements:
While performing the duties of this job, the employee is occasionally required to stand; walk; sit; use hands to finger, handle, or feel objects, tools, or controls; reach with hands and arms; climb stairs; balance; stoop, kneel, crouch, or crawl; talk or hear; and taste or smell. The employee must occasionally lift or move up to 30 pounds. Specific vision abilities required by the job include close vision, distance vision, color vision, peripheral vision, depth perception, and the ability to adjust focus.
