What are the responsibilities and job description for the Senior Golang Developer position at AI Cybersecurity Company?
About the Role
As a Senior Golang Developer, you’ll spearhead the design and implementation of highly scalable microservices and data pipelines that process petabyte-scale logs and analytics data. You’ll leverage cloud-native technologies to ensure our systems ingest, store, and query vast volumes of telemetry with sub-second latency and rock-solid reliability.
On-site work policy: 5 days per week required at our San Jose, CA office
Responsibilities
- Architect for Scale: Design, build, and optimize Go microservices capable of ingesting and processing millions of events per second.
- High-Volume Data Pipelines: Develop and maintain ETL/ELT workflows into our data platform, ensuring efficient clustering, partitioning, and cost-optimized storage.
- Logging & Observability: Integrate and tune logging/observability tools for real-time aggregation, alerting, and capacity planning at massive scale.
- Cloud Deployment: Deploy and operate services on Azure, leveraging AKS, Event Hubs, Data Factory, and other native services for resilience and auto-scaling.
- Cross-Functional Collaboration: Partner with SRE, data engineering, and product teams to define SLOs/SLIs and roll out observability dashboards.
- Code Quality & Security: Lead code reviews, define best practices for maintainability, and ensure compliance with secure coding standards in high-throughput scenarios.
Qualifications
- 5+ Years of Software Development: Proven track record building production-grade microservices in Go.
- Scale & Performance: Demonstrable experience architecting systems for high throughput (hundreds of MB/sec or millions of events/sec).
- Cloud & Containers: Strong Azure background (AKS, Event Hubs, Data Factory) plus solid Docker/Kubernetes operational know-how.
- CI/CD Tooling: Skilled with Jenkins, Spinnaker, GitHub Actions (or similar), and infrastructure as code (Terraform, ARM templates).
- APIs & Data Stores: Comfortable with REST/gRPC, message queues (Kafka, RabbitMQ), and modern datastores (NoSQL, time-series, columnar).
- Startup Mindset: Thrives in a fast-paced, collaborative environment and embraces “you build it, you run it.”
Preferred Experience
- Hands-on with Snowflake (or similar) for data warehousing: schema design, query optimization, and cost monitoring.
- Deep experience configuring and tuning Splunk (or ELK/Vector) for multitenant, high-cardinality log environments.
- Proven track record developing technology for petabyte-scale log ingestion, storage, and analytics.