Tel Aviv, Israel

Senior Data Engineer

About Fabric

Fabric enables retailers and brands to profitably scale their online business with fast fulfillment and a new kind of delivery experience. By leveraging innovative software and robotics and placing flexible micro-fulfillment centers close to where customers are, Fabric helps businesses meet even 1-hour delivery standards. Better yet, with Fabric’s powerful technology, businesses can deliver an engaging, branded experience that helps strengthen their customer relationships.

Founded in 2015, Fabric has raised $338 million to date and is backed by Aleph, Corner Ventures, Canada Pension Plan Investment Board (CPPIB), Evolv (Kraft Heinz), Innovation Endeavors, La Maison, Playground Ventures, and Temasek. With offices in New York City, Atlanta and Tel Aviv, Fabric is constantly growing with over 200 team members globally and 20 sites under development/contract, including four live micro-fulfillment centers. 

Fabric continues its rapid expansion and plans to keep rolling out its operations in key urban locations as it realizes its mission to bring brands and online shoppers closer together.

We’re a fast-growing, ultra-collaborative company full of brilliant minds and hard workers. We work in what (we think, anyway) is the most interesting industry on the planet, and we’re solving complex problems that we can all relate to as consumers on a daily basis.

The Role

We are looking for a Sr. Data Engineer to join our Data team in Israel.

Your Team

Fabric is revolutionizing on-demand e-commerce by solving fulfillment challenges through robotics, automation and artificial intelligence (AI) in the operation and service of micro-fulfillment centers (MFCs). The Fabric Data team provides the data and insights that make the entire process more effective and efficient. We enable our frontline teams, business and clients to make better decisions, and we empower our customers to deliver an exceptional e-commerce experience to their own customers.

Your Position

As a Sr. Data Engineer, you will be part of Fabric's global Data team, reporting to the Director of Data Engineering. You will collaborate with robotics, product, engineering, operations and other internal stakeholders to design and shape data platforms and data products for internal stakeholders and external customers. You will work closely with other data engineers, BI engineers and analysts to design, develop and implement modern data product solutions, including ETL, data warehousing, data modeling and data delivery. You will be guided and mentored by senior team members and your leader.

What You’ll Do

  • Work with your team and stakeholders to understand functional requirements and turn them into technical solutions.
  • Design technical solutions, then develop and implement them using Snowflake, Python and visualization technologies.
  • Build the data delivery layers (Share, Data Exchange, API, visualization using open-source technology, and web) on or from Snowflake.
  • Explore, advise on and design future data product solutions.
  • Contribute to data product strategy and roadmap.
  • Coach and mentor team members.
  • Support the data engineering needs of the legacy platform.
  • Communicate data platform and data product value to stakeholders and partners.
  • Document information and promote best practices.


Who You Are

Essential (Must have):
  • 8+ years of experience in data engineering and data product design and development.
  • Hands-on experience building large-scale data warehouses, preferably on Snowflake, following the Kimball or Data Vault modeling approach.
  • 3+ years of experience architecting cloud-based data platforms and solutions.
  • 2+ years of work experience on a cloud platform (GCP preferred).
  • 4+ years (including current project) of hands-on experience developing data solutions in Python or Java.
  • Strong understanding of, and experience with, SQL and database fundamentals.
  • An admirable work ethic, integrity and enthusiasm.


Desirable (Nice to have):

  • Designed and built data pipelines downstream of Kafka.
  • Experience with dbt (data build tool).
  • Experience with workflow orchestration tools (e.g. Airflow, Prefect, Dagster).
  • Experience designing and implementing real-time streaming analytics solutions.
  • Experience working in a fast-paced, outcome-driven start-up environment.
  • Hands-on experience with Snowflake.
  • Experience working with Sisense or a similar BI tool.
  • Experience building microservice-based, cloud-native applications.
  • Experience in e-commerce fulfillment operations.