What are the responsibilities and job description for the Lead Data Engineer position at BizTek People, Inc. | APA International Placement Consultants?
WHAT ARE WE LOOKING FOR
We are seeking a hard-working Lead Data
Engineer for an important role in the development of our organization's Enterprise Data
& Analytics capabilities! You will be a member of an organization focused
on the development and delivery of data solutions and capabilities. The
successful candidate is motivated by thinking big about data, is technically proficient,
and enjoys working in a fast-paced environment.
WHAT WILL YOU WORK ON
As a Lead Data Engineer, you will drive work
to completion with hands-on development responsibilities, and partner with the
Engineering Manager to provide thought leadership and innovation.
IN THIS ROLE, YOU WILL:
· Provide technical leadership toward
architecting and delivering end-to-end solutions.
· Work with Product Managers
and partners on prioritization, impact assessment, and issue resolution.
· Drive and ensure that the team's work
is of the highest quality and delivered on time.
· Review designs, code, and test plans
developed by other data engineers to help maintain data
engineering standards.
· Partner with other teams on
cross-functional initiatives.
· Craft and build reusable components, frameworks,
and libraries at scale to support analytics products.
· Design and implement product features
in collaboration with business and Technology partners.
· Develop architecture and design
patterns to process and store high volume data sets.
· Identify and tackle issues concerning
data management to improve data quality.
· Collaborate on the implementation of
new data management projects and the restructuring of the current data architecture.
· Implement automated workflows and routines using workflow scheduling tools.
· Build continuous integration,
test-driven development, and production deployment frameworks.
· Troubleshoot data issues and perform
root cause analysis to proactively resolve product and operational issues.
· Participate in an Agile/Scrum methodology to deliver high-quality software releases every
2 weeks through Sprints.
WHO WILL YOU WORK WITH
Requirements
WHAT YOU BRING
We are looking for best-in-class talent that
brings the following key skills and experience to this role:
· Minimum Bachelor's degree in IT or
related field.
· One of the following alternatives may
be accepted: PhD or Law degree and 3 years' experience; Master's and 4 years; Associate's and 6 years;
High School and 7 years' experience.
· 7 years' experience with detailed
knowledge of data warehouse technical architectures, infrastructure components,
ETL/ELT, and reporting/analytic tools.
· 3 years' experience with the Databricks
Lakehouse Platform
· 4 years' experience in Big Data stack
environments (Hadoop, Spark, Hive, and Delta Lake)
· 4 years' experience working with multiple file
formats (Parquet, Avro, Delta Lake) and APIs
· 3 years' experience in cloud
environments such as AWS (serverless technologies like AWS Lambda and API Gateway;
NoSQL stores like DynamoDB; EMR and S3)
· Experience with relational and
non-relational (NoSQL) databases
· Proven experience in coding languages
such as Python, Scala, and Java
· Experience building real-time
streaming data pipelines
· Experience with pub/sub systems such as Kafka
· Solid grasp of data structures and
algorithms
· Experience building lambda, kappa,
microservice, and batch architectures
· Experience with CI/CD processes
and source control tools such as GitHub, and related development workflows