What are the responsibilities and job description for the DataOps Senior Specialist position at HTC Global Services?
HTC Global Services wants you. Come build new things with us and advance your career. At HTC Global you'll collaborate with experts. You'll join successful teams contributing to our clients' success. You'll work side by side with our clients and have long-term opportunities to advance your career with the latest emerging technologies.
At HTC Global Services, our consultants have access to a comprehensive benefits package. Benefits can include Paid Time Off, Paid Holidays, 401(k) matching, Life and Accidental Death Insurance, Short & Long Term Disability Insurance, and a variety of other perks.
Job Description
We are seeking a highly skilled and experienced Senior DataOps Engineer to join our EPEO DataOps team. This role will be pivotal in designing, building, and maintaining robust, scalable, and secure telemetry data pipelines on Google Cloud Platform (GCP). The ideal candidate will have a strong background in DataOps principles, deep expertise in GCP data services, and a solid understanding of IT operations, especially within the security and network domains. You will enable real-time visibility and actionable insights for our security and network operations centers, contributing directly to our operational excellence and threat detection capabilities.
Key Responsibilities
- Design & Development: Lead the design, development, and implementation of high-performance, fault-tolerant telemetry data pipelines for ingesting, processing, and transforming large volumes of IT operational data (logs, metrics, traces) from diverse sources, with a focus on security and network telemetry.
- GCP Ecosystem Management: Architect and manage data solutions using a comprehensive suite of GCP services, ensuring optimal performance, cost-efficiency, and scalability. This includes leveraging services like Cloud Pub/Sub for messaging, Dataflow for real-time and batch processing, BigQuery for analytics, Cloud Logging for log management, and Cloud Monitoring for observability.
- DataOps Implementation: Drive the adoption and implementation of DataOps best practices, including automation, CI/CD for data pipelines, version control (e.g., Git), automated testing, data quality checks, and robust monitoring and alerting.
- Security & Network Focus: Develop specialized pipelines for critical security and network data sources such as VPC Flow Logs, firewall logs, intrusion detection system (IDS) logs, endpoint detection and response (EDR) data, and Security Information and Event Management (SIEM) data (e.g., Google Security Operations / Chronicle).
- Data Governance & Security: Implement and enforce data governance, compliance, and security measures, including data encryption (at rest and in transit), access controls (RBAC), data masking, and audit logging to protect sensitive operational data.
- Performance Optimization: Continuously monitor, optimize, and troubleshoot data pipelines for performance, reliability, and cost-effectiveness, identifying and resolving bottlenecks.
- Collaboration & Mentorship: Collaborate closely with IT operations, security analysts, network engineers, and other data stakeholders to understand data requirements and deliver solutions that meet business needs. Mentor junior engineers and contribute to the team's technical growth.
- Documentation: Create and maintain comprehensive documentation for data pipelines, data models, and operational procedures.
Qualifications
- Technical skills: Data Analysis, Cloud Infrastructure, Data Governance, Data Modeling, Network Security, Data Warehousing, Data Acquisition, Python, Data Conversion
- Professional skills: Technical Communication, Troubleshooting (Problem Solving), Critical Thinking, Performance Tuning, Cross-functional Collaboration
- Senior Specialist experience: 7 years of experience in a relevant field.
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Technology, or a related quantitative field.
- Typically 8 years of experience in data engineering, with at least 4 years in a Senior or Lead role focused on DataOps or cloud-native data platforms.