LatentView Analytics is a leading global analytics and decision sciences provider, delivering solutions that help companies drive digital transformation and use data to gain a competitive advantage. With analytics solutions that provide a 360-degree view of the digital consumer, fuel machine learning capabilities, and support artificial intelligence initiatives, LatentView Analytics enables leading global brands to predict new revenue streams, anticipate product trends and popularity, improve customer retention rates, optimize investment decisions, and turn unstructured data into valuable business assets.
Job Description:
We are seeking a highly skilled Principal Data Engineer with strong expertise in Snowflake/AWS/GCP and a proven track record of leading technical teams, delivering enterprise-grade data analytics platforms, and managing customer relationships from a subject-matter-expert (SME) point of view. In this role, you will be responsible for conducting customer workshops, discovering and documenting the requirements backlog, defining architecture and proposing solutions, and supporting RFPs in close collaboration with sales and account teams.
Primary Expertise:
- Should have architected at least one enterprise data lakehouse solution using Snowflake/AWS/GCP
- Should know how to design/architect enterprise data analytics systems such as data marts, feature stores, MDM, data catalogs, metadata-driven batch/streaming ingestion frameworks, data quality frameworks, MLOps frameworks, and GenAI/agentic AI engineering frameworks
- Should be a fluent communicator on data engineering subject matter
- Should be experienced in responding to data engineering (DE) RFPs
- Should be able to conduct DE workshops with customers independently and document backlogs
Job Responsibilities:
Solution Architecture & Design:
- Lead the end-to-end architecture and design of scalable, secure, and cost-optimized Snowflake/AWS/GCP data analytics platforms.
- Architect data engineering systems incorporating Big Data, ML Feature Stores, Data Governance, Data Quality (DQ) Frameworks, and Semantic Layers.
- Collaborate with clients, architects, and business stakeholders to define cloud-native solutions that meet business and technical requirements.
- Ensure data security, governance, and compliance are embedded into all solutions.
Data Modelling & Implementation:
- Oversee the design and implementation of data models, schemas, and integration frameworks on Snowflake.
- Drive the development of high-performance ETL/ELT pipelines using Snowflake-native features and modern orchestration tools.
- Implement and manage data engineering systems such as MDM, data catalogs, metadata management, and enterprise data integration.
- Ensure data quality, lineage, and cataloging are built into the engineering workflows.
Performance Tuning & Optimization:
- Analyze and identify opportunities to optimize data architecture, ETL design, data pipelines, and data infrastructure on Snowflake/AWS/GCP, and design solutions to address those opportunities.
Client Workshops:
- Conduct client workshops to understand business and technical requirements, probe needs, and identify solution opportunities.
- Translate client discussions into product backlogs and technical initiatives that align with business priorities.
- Partner with sales and pre-sales teams to support GTM activities, creating solution collateral, demos, and client presentations.
- Actively participate in RFP responses, client workshops, and technical evangelism to showcase the organization’s capabilities.
- Provide thought leadership on Snowflake/AWS/GCP implementations and emerging data engineering trends.
Skills & Qualifications:
- 10–14 years of experience in data engineering, with 4+ years of hands-on expertise in Snowflake/AWS/GCP.
- Proven experience in data modelling, ETL/ELT pipeline development, and integration frameworks.
- Strong leadership, client interaction, and stakeholder management skills, with proven cross-functional collaboration.
- Ability to understand client needs, probe requirements, and translate them into actionable technical roadmaps.
- Hands-on experience with cloud ecosystems (AWS/Azure/GCP) integrated with Snowflake/Redshift/BigQuery.
Preferred Skills:
- Exposure to pre-sales, RFP management, and GTM enablement.
- Experience with modern data lakehouse architectures built on Delta Lake, Iceberg, or Hudi.
- Familiarity with real-time streaming tools (Kafka, Kinesis, or Pub/Sub).
- Understanding of data governance, cataloging, and security frameworks (Collibra, Alation, Purview).
- Certifications in Snowflake, cloud platforms, or data engineering are a strong plus.
At LatentView Analytics, we value a diverse, inclusive workforce and provide equal employment opportunities for all applicants and employees. All qualified applicants for employment will be considered without regard to an individual's race, colour, sex, gender identity, gender expression, religion, age, national origin or ancestry, citizenship, physical or mental disability, medical condition, family care status, marital status, domestic partner status, sexual orientation, genetic information, military or veteran status, or any other basis protected by federal, state or local laws.