What are the responsibilities and qualifications for the GCP Data Integration Engineer position at VeriiPro?
Job Description
- Design, develop, and optimize scalable data pipelines using Apache NiFi and GCP Dataflow (see the pipeline sketch after this list).
- Support the integration and migration of legacy data workflows and services to Google Cloud Platform (GCP).
- Build event streaming solutions using Kafka or GCP Pub/Sub (a minimal producer sketch follows this list).
- Develop REST APIs using Spring Boot or Apache Camel for seamless data access and integration (a controller sketch appears below).
- Work with a variety of storage systems including RDBMS, NoSQL, document stores, and search indices.
- Leverage cloud-native tools (Docker, Kubernetes) to support application development and deployment.
- Ensure high system reliability with instrumentation and monitoring tools such as Dynatrace and Splunk.
- Participate in Agile ceremonies and contribute to CI/CD practices using Jenkins, GitHub, and XLR.
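To make the Dataflow side of the pipeline bullet concrete, here is a minimal Apache Beam sketch in Java of the kind Dataflow executes. The bucket paths, transform names, and cleansing steps are illustrative assumptions, not details from the posting.

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Filter;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptors;

public class CleansePipeline {
    public static void main(String[] args) {
        // Standard Beam options; passing --runner=DataflowRunner plus project
        // and region flags submits this same pipeline to GCP Dataflow.
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
        Pipeline pipeline = Pipeline.create(options);

        pipeline
            // Hypothetical GCS paths; substitute a real bucket.
            .apply("ReadRecords", TextIO.read().from("gs://example-bucket/input/*.csv"))
            .apply("DropBlankLines", Filter.by((String line) -> !line.trim().isEmpty()))
            .apply("NormalizeCase", MapElements
                .into(TypeDescriptors.strings())
                .via((String line) -> line.toLowerCase()))
            .apply("WriteRecords", TextIO.write().to("gs://example-bucket/output/records"));

        pipeline.run();
    }
}
```

Without the Dataflow runner flag, the same code runs locally on Beam's DirectRunner, which is convenient for testing pipeline logic before deploying.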
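For the event-streaming bullet, here is a minimal Kafka producer sketch in Java; the broker address, topic name, key, and payload are hypothetical placeholders.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class EventPublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Hypothetical broker address; point this at a real cluster.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        // try-with-resources closes (and flushes) the producer on exit.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Keying by entity id routes related events to the same partition.
            ProducerRecord<String, String> record =
                new ProducerRecord<>("orders", "order-42", "{\"status\":\"CREATED\"}");
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.printf("Published to %s-%d@%d%n",
                        metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
        }
    }
}
```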
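For the REST API bullet, a minimal Spring Boot controller sketch follows (it requires Java 16+ for records); the Dataset type and the /api/v1/datasets route are hypothetical, standing in for a real integration payload.

```java
import java.util.List;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
@RestController
@RequestMapping("/api/v1/datasets")
public class DatasetApi {

    // Hypothetical payload type; Spring serializes it to JSON automatically.
    record Dataset(String id, String name) {}

    @GetMapping
    public List<Dataset> listDatasets() {
        return List.of(new Dataset("ds-1", "customer-events"));
    }

    @GetMapping("/{id}")
    public Dataset getDataset(@PathVariable String id) {
        return new Dataset(id, "customer-events");
    }

    public static void main(String[] args) {
        SpringApplication.run(DatasetApi.class, args);
    }
}
```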
Qualifications
- Bachelor’s degree in Computer Science, Engineering, or a related technical field, or equivalent professional experience.
- Proven experience with GCP, AWS, or other cloud-based big data platforms.
- Experience with Hadoop, MapR, Hive, Spark, and shell scripting.
- Solid understanding of distributed system architectures and cloud cluster configuration.
- Strong programming skills, preferably in Java, with solid OOP fundamentals (SOLID principles and design patterns); a brief illustration follows this list.
- Proficiency with SQL and HSQL, and experience managing diverse storage technologies.
- Hands-on experience with containerization and orchestration tools such as Docker and Kubernetes.
- Familiarity with source control and CI/CD workflows (Jenkins, GitHub, XLR).
- Comfortable working in Agile development environments.
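As a small illustration of the OOP fundamentals listed above, here is a sketch of the dependency-inversion principle in Java; all class names are hypothetical.

```java
import java.util.List;

// Dependency inversion: the high-level job depends on an abstraction, so the
// concrete sink (BigQuery, Kafka, file) can be swapped without changing it.
interface RecordSink {
    void write(String record);
}

class ConsoleSink implements RecordSink {
    @Override
    public void write(String record) {
        System.out.println("wrote: " + record);
    }
}

class IngestJob {
    private final RecordSink sink;

    IngestJob(RecordSink sink) {
        this.sink = sink; // injected dependency, easy to mock in tests
    }

    void run(Iterable<String> records) {
        for (String record : records) {
            sink.write(record);
        }
    }
}

public class SolidDemo {
    public static void main(String[] args) {
        new IngestJob(new ConsoleSink()).run(List.of("a", "b"));
    }
}
```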