Senior Data Engineer
ZyG
Data Science
Tel Aviv District, Israel · Tel Aviv-Yafo, Israel
Posted on Mar 21, 2026
ZyG is revolutionizing eCommerce with the first Agentic Operating System for eCom scale.
This end-to-end platform helps product inventors and entrepreneurs turn their products into successful Direct-to-Consumer businesses, addressing the three key challenges of eCom scale: poor product bets, lack of scale infrastructure, and the need for often-complex financing.
With a connected system of AI Agents built on a unified data infrastructure, the OS analyzes scale-market fit, replaces the fragmented tools, agencies, and manual workflows that typically power eCom growth, and offers the financing needed for DTC scale.
The Role
We are looking for a Senior Data Engineer with a strong Backend Engineering background to architect the intelligent data foundation of ZyG.
This is not a traditional ETL role. You will build the critical infrastructure that feeds our Agentic Growth Engine, enabling AI agents to perceive the market, make decisions, and drive growth. You will own the semantic layer, design the agentic data architecture (both data retrieval and training infrastructure), and build sophisticated extraction processes that allow our system to "read" the competitive landscape.
What You'll Do
- Architect the Agentic Data Infrastructure: Design and maintain the core data pipelines and storage systems (GCP/BigQuery/Postgres) that power our AI agents, ensuring high availability and low latency for decision-making.
- Build the RAG & ML Backbone: Manage the infrastructure for training data, vector search, and regression testing. You will ensure our agents have access to clean, context-aware data for RAG workflows.
- Develop Agentic Extraction Processes: Build complex, resilient data extraction systems (crawlers/scrapers) to map competitive landscapes and product trends, feeding raw market data into our analysis engine.
- Own the ELT & Semantic Layer: Manage the transformation of raw data into a consistent, business-ready semantic layer that serves as the "source of truth" for both analytics and AI models.
- Run Predictive Models: Operationalize and deploy predictive models on top of our data, integrating them into the core product workflow.
- Define Data Standards: As a senior owner, you will establish the data engineering best practices, coding standards, and architectural patterns that the rest of the engineering team will follow.
Requirements
- 8+ years of experience in Backend Engineering or Data Engineering with a software-first mindset.
- Strong proficiency in Python and SQL for data manipulation and modeling.
- Experience in building high-performance services or scalable backend systems.
- Deep expertise in the Modern Data Stack: specifically GCP, BigQuery, and Airflow (or similar orchestration tools).
- AI/ML Infrastructure familiarity: Experience building or supporting infrastructure for LLMs, RAG applications, or managing vector databases.
- Data Modeling Expert: Proven ability to design complex schemas and semantic layers that simplify data access for downstream consumers.
- Architectural Ownership: You are comfortable taking a vague requirement (e.g., "map the competitive landscape") and designing the entire data lifecycle to solve it.