Senior DataOps Engineer
Dive right in. Swim with our pod.
At Orca, in the right environment and with the right team, talent has no boundaries. This team spirit, together with our drive to always aim high, has quickly earned us unicorn status and turned us into a global cloud security innovation leader. So if you’re ready to join an amazing team of people who inspire each other every day, now is the time to find your place in our pod.
We’re looking for driven and talented people like you to join our R&D team and our mission to change the future of cloud security. Ready to dive in and swim with our pod?
- High-growth: In just 4 years, we’ve reached milestones that take other companies a decade or more. We’ve doubled our employee count, tripled our customer count, and rapidly expanded our product capabilities.
- Disruptive innovation: Our founders saw that traditional security didn’t work for the cloud—so they set out to carve a new path. We’re relentless pioneers who invented agentless technology and continue to be the most comprehensive and innovative cloud security company.
- Well-capitalized: With a valuation of $1.8 billion, Orca is a cybersecurity unicorn dominating the cloud security space. We’re backed by an impressive team of investors such as CapitalG, ICONIQ, GGV, and SVCI, a syndicate of CISOs who invest their own money after conducting due diligence.
- Respectful and transparent culture: Our executives pride themselves on being accessible to everyone and believe in sharing knowledge with employees. Every employee has a hand in shaping the future of our industry.
About the role:
We are seeking a talented and motivated DataOps Engineer to join our dynamic platform engineering team. As a DataOps Engineer focused on data streaming and datastore solutions, you will play a crucial role in designing, implementing, and maintaining robust, scalable, and efficient data pipelines and storage systems. Your primary responsibility will be to bridge the gap between data engineering and DevOps, ensuring seamless integration of data workflows with our cloud-based infrastructure and tools.
On a typical day, you'll:
- Apply DevOps principles to data engineering workflows to enable seamless integration, automation, and version control of data pipelines and related infrastructure.
- Leverage Terraform and Kubernetes to automate the provisioning and orchestration of data-related resources and containers.
- Implement monitoring and alerting mechanisms to proactively identify data-related issues and performance bottlenecks.
- Optimize data storage solutions for performance, cost, and data security.
- Maintain detailed documentation of data infrastructure, streaming pipelines, and data workflows.
- Collaborate with cross-functional teams, including data engineers, data scientists, and business stakeholders, to understand data requirements and provide technical expertise.
What you’ll bring:
- At least 3 years of experience as a DataOps Engineer, Data Infrastructure Engineer, or in a related role, with a focus on data streaming and data storage solutions in public clouds (AWS preferred).
- Experience managing data-intensive systems such as Kafka, Redis, Elasticsearch, MongoDB, SingleStore, and Airflow.
- Hands-on experience with infrastructure-as-code tools like Terraform and container orchestration platforms like Kubernetes.
- Solid understanding of DevOps principles and experience applying them in data engineering workflows.
- Excellent problem-solving skills and the ability to troubleshoot complex data-related issues.
- Strong communication skills and the ability to work effectively in a collaborative team environment.