Location: Hybrid (East Windsor, NJ)
Job Type: Full-Time
Salary: Commensurate with experience
Bonus: Eligible for a discretionary bonus
Reports To: Director of Data Architecture & Engineering
HMP Global is a leading healthcare event and education company with a dominant position in several therapeutic areas, including Oncology, Psychiatry & Behavioral Health, Cardiovascular, Wound Care, and Public Safety. With a mission to improve patient care, we deliver information and education to healthcare professionals through 400+ global, regional, and local events and reach over 4 million users monthly through digital networks and social channels.
Job Summary
We are hiring a Senior Data Engineer to design, implement, and secure the core data pipelines powering analytics, personalization, and operational workflows across our enterprise. You will work with a modern cloud-native stack, including Snowflake, Rivery, AWS, Azure, and DBT, to ingest and transform data from systems such as BlueConic, Ongage, Swoogo, NetSuite, and HubSpot.
This is a hands-on leadership role focused on real-time and batch ETL, reverse ETL, data contract enforcement, and the implementation of privacy, compliance, and security controls, ensuring the organization’s data assets are accurate, governed, and protected.
Key Responsibilities
Multi-Source Data Ingestion & Cloud Integration
- Design and manage robust pipelines from:
  - BlueConic, Ongage, Swoogo, NetSuite, HubSpot
  - REST/GraphQL APIs, webhooks, databases, cloud storage (S3, GCS, Azure Blob)
- Use Rivery and native cloud tools (AWS Glue, Lambda, S3, Azure Data Factory, Logic Apps) for data ingestion, transformation, and orchestration.
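As a rough illustration of this kind of ingestion work, the sketch below shows an AWS Lambda handler that pulls records from a source REST API and lands them in S3 for downstream loading into Snowflake (for example via Snowpipe or a Rivery river). The endpoint, bucket, and key prefix are placeholders, and `requests` is assumed to be packaged with the function.

```python
import json
import os
from datetime import datetime, timezone

import boto3
import requests  # assumed to be packaged as a Lambda layer or dependency

s3 = boto3.client("s3")

# Placeholder values for illustration only.
API_URL = "https://api.example-source.com/v1/contacts"
RAW_BUCKET = os.environ.get("RAW_BUCKET", "example-raw-landing")


def handler(event, context):
    """Pull a page of records from a source API and land it in S3 as JSON.

    A Snowflake external stage with Snowpipe (or a Rivery river) would pick
    the file up from this prefix for loading and transformation.
    """
    resp = requests.get(API_URL, timeout=30)
    resp.raise_for_status()
    records = resp.json()

    key = f"raw/contacts/{datetime.now(timezone.utc):%Y/%m/%d/%H%M%S}.json"
    s3.put_object(
        Bucket=RAW_BUCKET,
        Key=key,
        Body=json.dumps(records).encode("utf-8"),
        ContentType="application/json",
    )
    return {"landed_key": key, "record_count": len(records)}
```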
ELT with Snowflake & DBT
- Build modular ELT pipelines in Snowflake using DBT, SQL, and Python for staging, core, and presentation layers.
- Apply Snowflake best practices for role-based access, row-level security (RLS), data masking, and cost optimization.
- Optimize pipelines for scalability, maintainability, and performance.
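By way of example, the following is a minimal sketch of a dbt Python model on Snowflake's Snowpark backend (most dbt models in a stack like this would be SQL); the model and source names, such as `stg_swoogo__registrations`, are hypothetical.

```python
# models/core/dim_attendees.py  (hypothetical model and ref names)
from snowflake.snowpark.functions import col, lower


def model(dbt, session):
    # Materialize as a table in the core layer.
    dbt.config(materialized="table")

    # Staged event-registration data, e.g. loaded from Swoogo.
    registrations = dbt.ref("stg_swoogo__registrations")

    # Keep only rows with an email and standardize its casing so the
    # presentation layer can join cleanly against CRM and marketing data.
    return (
        registrations
        .filter(col("EMAIL").is_not_null())
        .with_column("EMAIL", lower(col("EMAIL")))
    )
```

In a layered project, a model like this sits between staging and presentation, with tests and documentation defined alongside it.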
Real-Time Data Activation & Reverse ETL
- Build real-time and micro-batch pipelines using cloud triggers, APIs, and streaming inputs (e.g., AWS Kinesis, Azure Event Hubs).
- Manage reverse ETL workflows syncing Snowflake data into systems like HubSpot, NetSuite, or marketing platforms via Hightouch, Census, or Rivery.
- Collaborate with marketing, analytics, and business teams on operational data use cases.
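To illustrate the real-time side, the sketch below shows a micro-batch consumer: an AWS Lambda triggered by a Kinesis stream that decodes engagement events for loading into Snowflake. The `land_events` helper is a placeholder for an S3/Snowpipe or Snowpipe Streaming step; reverse ETL syncs themselves would typically be configured in Hightouch, Census, or Rivery rather than hand-coded.

```python
import base64
import json


def handler(event, context):
    """AWS Lambda triggered by a Kinesis stream: decode a micro-batch of
    engagement events and hand them to a loader."""
    events = []
    for record in event.get("Records", []):
        # Kinesis payloads arrive base64-encoded.
        payload = base64.b64decode(record["kinesis"]["data"])
        events.append(json.loads(payload))

    if events:
        land_events(events)
    return {"processed": len(events)}


def land_events(events):
    # Placeholder loader: in practice this could write the batch to an S3
    # landing prefix watched by Snowpipe, or use Snowpipe Streaming.
    pass
```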
Data Contracts, Quality, and Governance
- Define and enforce data contracts including schemas, freshness SLAs, and data ownership across teams.
- Implement data testing, anomaly detection, and schema drift handling using DBT tests, Great Expectations, and observability tools.
- Document lineage, transformation logic, and data governance metadata.
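As one concrete example of contract enforcement, the sketch below checks a Snowflake table's columns against an agreed schema before downstream models run; the table name and column set are illustrative, and in practice much of this would live in dbt tests or dbt model contracts.

```python
import snowflake.connector  # connection parameters omitted for brevity

# Hypothetical contract: the column set agreed with upstream owners.
CONTRACT = {
    "STG_HUBSPOT__CONTACTS": {"CONTACT_ID", "EMAIL", "LIFECYCLE_STAGE", "UPDATED_AT"},
}


def check_contract(conn, schema: str) -> list[str]:
    """Return a list of human-readable contract violations, if any."""
    violations = []
    cur = conn.cursor()
    for table, expected_cols in CONTRACT.items():
        cur.execute(
            "SELECT column_name FROM information_schema.columns "
            "WHERE table_schema = %s AND table_name = %s",
            (schema, table),
        )
        actual_cols = {row[0] for row in cur.fetchall()}
        missing = expected_cols - actual_cols
        if missing:
            violations.append(f"{table}: missing columns {sorted(missing)}")
    return violations
```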
Data Security & Privacy Implementation
- Enforce data security and privacy standards across all pipelines:
  - Implement PII masking, tokenization, and access control in Snowflake and cloud data services.
  - Ensure alignment with HIPAA, GDPR, and company data classification policies.
- Collaborate with compliance and security teams to audit pipelines, control exposure, and support consent-based processing.
- Design secure integration patterns for sensitive domains like healthcare and financials.
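For illustration, the sketch below applies a column-level masking policy to an email field in Snowflake from Python; the role, schema, table, and column names are hypothetical, and in production such policies are usually managed as versioned dbt or Terraform code rather than ad hoc scripts.

```python
import snowflake.connector  # connection handling omitted for brevity

# Hypothetical role, schema, table, and column names.
CREATE_POLICY = """
CREATE MASKING POLICY IF NOT EXISTS governance.email_mask AS (val STRING)
RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PII_ANALYST') THEN val
    ELSE '***MASKED***'
  END
"""

APPLY_POLICY = """
ALTER TABLE core.contacts
  MODIFY COLUMN email SET MASKING POLICY governance.email_mask
"""


def apply_email_masking(conn) -> None:
    # Create the policy once, then bind it to the PII column so only
    # approved roles see raw values.
    cur = conn.cursor()
    cur.execute(CREATE_POLICY)
    cur.execute(APPLY_POLICY)
```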
Legacy Migration & DevOps
- Lead migration of legacy ETL workflows to scalable ELT frameworks using Snowflake and cloud-native tools.
- Establish CI/CD practices for data workflows using GitHub, dbt Cloud, Azure DevOps, or similar.
- Automate testing, version control, and release deployment for all pipeline changes.
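A minimal sketch of such a CI gate is shown below: it runs `dbt build` against a disposable CI target, selecting only models changed relative to production artifacts, and fails the build on any model or test error. The target name and state path are placeholders; the same step could run in GitHub Actions, Azure DevOps, or a dbt Cloud job.

```python
import subprocess
import sys


def main() -> int:
    # Build and test only models changed relative to production artifacts
    # ("slim CI"); a non-zero exit code fails the pipeline.
    result = subprocess.run(
        [
            "dbt", "build",
            "--target", "ci",                # disposable CI schema/warehouse
            "--select", "state:modified+",   # changed models and their children
            "--state", "prod-artifacts/",    # path to the production manifest
        ],
        check=False,
    )
    return result.returncode


if __name__ == "__main__":
    sys.exit(main())
```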
Required Qualifications
- 5+ years of experience in data engineering, analytics engineering, or data platform roles.
- Deep expertise with Snowflake, including security, RBAC, data masking, and performance tuning.
- Proven use of Rivery for building both ingestion and reverse ETL pipelines.
- Experience integrating BlueConic, Ongage, Swoogo, NetSuite, HubSpot, and various APIs.
- Hands-on experience with AWS (Glue, Lambda, S3) and/or Azure (Data Factory, Synapse, Functions) for data operations.
- Strong understanding of SQL, Python, DBT, and ELT design patterns.
- Experience enforcing data contracts, testing pipelines, and building observability dashboards.
- Proven implementation of data security, privacy, and compliance controls within data systems.
Preferred Qualifications
- Industry experience in healthcare, marketing tech, or event technology.
- Familiarity with data observability and governance tools (e.g., Monte Carlo, Atlan, Datafold).
- Understanding of BI validation processes with Power BI, Tableau, or Looker.
- Exposure to data risk mitigation practices including encryption at rest/in-transit, SOC 2, and audit trail monitoring.
Please follow HMP Global on LinkedIn for news and updates.