Senior Sales Engineer


StreamNative, founded by the creators of Apache Pulsar, is redefining real-time data streaming. Our platform empowers organizations to process and analyze massive data streams at scale with unparalleled efficiency. At its core is the URSA engine, which seamlessly integrates Apache Pulsar and Apache Kafka, delivering unmatched compatibility and performance. By simplifying deployments and enabling direct integration with modern lakehouse architectures, URSA helps businesses innovate faster and more effectively. With our cost-efficient solutions, companies can build cutting-edge, real-time applications that drive measurable outcomes. Join us in revolutionizing data streaming and enabling organizations to thrive in a fast-paced, data-driven world.

About the Role

As a Senior Sales Engineer, you will be a vital technical leader in our sales process, empowering companies to successfully adopt our cutting-edge streaming technology. You'll design solutions, lead product demonstrations, and guide customers through proofs-of-concept for streaming data architectures. Beyond core pre-sales activities, you'll play a crucial role in enabling our customers through training and ensuring product quality through rigorous testing. You will also act as a developer advocate, sharing your expertise and fostering a strong community around our products, and serve as a key trainer, developing and delivering educational content to maximize customer success. You'll collaborate with Product Management and Engineering teams, providing essential customer feedback to drive continuous product improvements. The right candidate is a self-starter who is proactive and able to juggle multiple priorities.

Responsibilities:
- Partner with Sales to uncover technical requirements and map customer use cases to our platform.
- Design scalable, real-time architectures for event streaming, ingestion, and processing.
- Advise customers on data modeling, partitioning strategies, schema management, and storage configuration.
- Benchmark performance (e.g., latency, throughput) and resolve technical objections.
- Simulate workloads using tools like kafka-producer-perf-test, pulsar-perf, or custom scripts.
- Advise prospects and customers on integrating with existing systems (databases, APIs, BI tools, etc.), and help teams move from batch ETL to stream processing.
- Collaborate with Product and Engineering to relay feedback from the field.
- Engage with the Product Management team as a customer advocate for product features and usability.
- Support RFP responses and help customers meet compliance and security requirements.
- Lead hands-on demos and proofs-of-concept using technologies like Pulsar, Kafka, Flink, or Spark.
- Prepare and conduct live demos for customer engagements.
- Present the latest StreamNative features at conferences, typically with live demos.
- Facilitate live and self-paced online trainings.
- Conduct office hours and troubleshoot example code with customers as needed.
- Maintain reusable demo environments and technical collateral for common industry use cases (IoT, CDC, log analytics, etc.).
- Prepare recording environments and record marketing demos and developer content for the StreamNative and StreamNative Academy YouTube channels.
- Create and help maintain technical documentation, enablement guides, and other written content to support customers, internal teams, and the developer community.
- Test StreamNative products and features across all development and release stages, from early engineering builds through validation in staging and production, including ecosystem integration with external databases; provide Engineering with feedback to help drive improvements.
- Maintain flexible working hours to collaborate with global teams and customers in multiple time zones.
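The benchmarking work above usually reduces to collecting per-record latencies (from kafka-producer-perf-test, pulsar-perf, or a custom harness) and summarizing throughput and tail latency. A minimal sketch of that summary step, assuming the latency samples were already gathered (the numbers below are invented for illustration):

```python
# Sketch: summarize a producer benchmark run from per-record latencies (ms).
# Assumes latencies were collected elsewhere (e.g. kafka-producer-perf-test
# output or a custom script); the sample values here are invented.

def summarize(latencies_ms, num_records, duration_s):
    """Return throughput (records/s) and p50/p99 latency (ms)."""
    ordered = sorted(latencies_ms)

    def pct(p):
        # Nearest-rank percentile over the sorted samples.
        idx = min(len(ordered) - 1, int(p / 100 * len(ordered)))
        return ordered[idx]

    return {
        "throughput_rps": num_records / duration_s,
        "p50_ms": pct(50),
        "p99_ms": pct(99),
    }

stats = summarize([2.1, 3.4, 2.8, 9.7, 2.5], num_records=5, duration_s=0.5)
print(stats)  # throughput_rps=10.0, p50_ms=2.8, p99_ms=9.7
```

In a real engagement the p99/p999 figures, not the average, are what resolve latency objections, since tail latency is what customer SLAs are written against.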
Qualifications:
- 5+ years of experience as a Sales Engineer, Solutions Architect, or equivalent technical sales role.
- 3+ years of experience in the data streaming, analytics, or event/messaging space.
- Strong programming background with Java and Python; comfortable with any modern programming language.
- Proficiency with Kubernetes and cloud platforms (AWS, GCP, Azure).
- Hands-on experience with Apache Kafka and/or Apache Pulsar.
- Familiarity with stream processing frameworks such as Apache Flink, ksqlDB, or Spark Streaming.
- Knowledge of data serialization formats (Avro, Protobuf, JSON) and schema evolution best practices.
- Ability to spin up demo environments using Docker, Kubernetes, and Terraform.
- Basic scripting or development experience in Python, Java, Go, or Bash.
- Strong communication and presentation skills; ability to translate deep technical concepts into business value.
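The schema-evolution practice mentioned above often comes down to one rule: fields added in a new version must carry defaults so older records remain readable. A toy illustration in plain Python, with Avro-style schemas as dicts (the compatibility check is hypothetical, not any schema registry's actual API):

```python
# Illustrative only: Avro-style record schemas as plain dicts, plus a toy
# backward-compatibility check. A new reader schema can decode records
# written with the old one iff every field it adds carries a default.
# This mimics the rule enforced by real schema registries; it is not one.

v1 = {"type": "record", "name": "OrderEvent",
      "fields": [{"name": "order_id", "type": "string"},
                 {"name": "amount", "type": "double"}]}

v2 = {"type": "record", "name": "OrderEvent",
      "fields": [{"name": "order_id", "type": "string"},
                 {"name": "amount", "type": "double"},
                 {"name": "currency", "type": "string", "default": "USD"}]}

def backward_compatible(old, new):
    """True if every field added in `new` has a default value."""
    old_names = {f["name"] for f in old["fields"]}
    return all("default" in f
               for f in new["fields"] if f["name"] not in old_names)

print(backward_compatible(v1, v2))  # True: the added field has a default
```

Dropping the `"default": "USD"` entry from `v2` would make the check fail, which is exactly the change a production schema registry set to backward compatibility would reject.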
Nice to Haves:
- Familiarity with developer kits like Google ADK or MCP servers, and AI coding assistants like Cursor, Copilot, or Gemini CLI.
- Experience with Kafka Connect, Pulsar IO, or Debezium.
- Background in real-time analytics, observability pipelines, or change data capture (CDC).
- Prior experience working with schema registries, Prometheus, Grafana, or OpenTelemetry.
Compensation:
The salary range for this position is $120,000.00 - $170,000.00 per year, depending on experience and location. Additional compensation may include performance-based bonuses, commission, and equity options.
Equal Opportunity Employer:
We are an equal-opportunity employer and make employment decisions based on qualifications and merit. We do not discriminate on the basis of race, color, religion, gender, gender identity, sexual orientation, national origin, age, disability, or any other characteristic protected by law.
Seniority level
Mid-Senior level
Employment type
Full-time
Industries
Software Development
Location:
San Francisco, CA, United States