
Senior Data Engineer

Israel (TLV Office)

About us

Oligo is a rapidly growing startup headquartered in Tel Aviv, leading the way in reshaping Application Security. With strong backing from top-tier VCs, including Lightspeed, Ballistic Ventures, and TLV Partners, we are developing a unique solution to address open-source security challenges.

Our innovative technology leverages runtime application context while maintaining exceptional performance and stability. The Oligo Application Defense Platform gives security teams powerful capabilities to observe application components, detect intrusions, and mitigate threats, all while keeping developers focused on features, not fixes.

What you’ll be doing

We are looking for a Senior Data Engineer to join our Application Detection and Response (ADR) squad. This multidisciplinary team, composed of developers and security researchers, is responsible for developing Oligo’s defense platform for real-time threat detection. The squad handles all aspects of the product, from researching security algorithms to full-stack platform development.

As a Senior Data Engineer in the squad, you will architect, build, and maintain our data infrastructure, ensuring it is scalable, efficient, and reliable.

Your role will be crucial to maintaining the core data platform that powers security insights and threat detection across Oligo’s customer environments. We expect you to be technically proficient, capable of leading projects, and skilled at working collaboratively across teams.
You will:

  • Lead the Product’s Data Architecture: Own complex data projects end-to-end, working closely with cross-functional teams. Actively drive decision-making and project execution, ensuring timely delivery.
  • Design & Develop Scalable Data Pipelines: Build robust, efficient data pipelines using modern technologies like Databricks, Delta Lake, and Spark, ensuring smooth data processing. Design scalable data schemas, leading discussions on data modeling and ensuring best practices are followed.

Qualifications

  • 3+ years of experience in software engineering or similar roles, with a strong track record of designing, building, and maintaining scalable and reliable systems. Preferred languages: Go, Python.
  • 3+ years of experience designing and building large-scale, operational data pipelines from scratch, specifically using Spark.
  • Experience in core data engineering areas, including data modeling, data warehousing solutions such as BigQuery, ClickHouse, and Databricks, and open table formats such as Delta Lake and Iceberg.
  • In-depth knowledge of optimization techniques such as data partitioning, clustering, data skipping, indexing, and query optimization.
  • Experience with AWS and other cloud providers.

Required Qualities

  • Proactive Leadership: We need someone who can drive decisions. We expect you to demonstrate technical leadership in your domain, initiate discussions, and help steer our projects in the right direction.
  • Ownership Mindset: You should take full ownership of your work, ensuring quality, reliability, and delivery without needing constant supervision.
  • Excellent Communication and Teamwork: Communicate complex technical concepts clearly, review code thoroughly, and collaborate effectively with stakeholders.
  • Efficiency & Quality: Strive for high-quality code, documentation, and communication, while ensuring you are efficient and can deliver quickly under pressure.