What are the responsibilities and job description for the Senior Data Engineer position at SoTalent?
Job Title: Senior Data Engineer
Location: Richmond, Virginia (Remote)
Type: Full-time
We are seeking top-tier talent to join our pioneering data engineering team and help shape the future of data-driven innovation.
As a Senior Data Engineer, you’ll collaborate with forward-thinking leaders in product, technology, and design who constantly challenge convention and build transformative tools from the ground up. You’ll be at the forefront of delivering data-powered solutions that drive meaningful user experiences and business outcomes.
What You’ll Do:
- Build Scalable Solutions: Collaborate with Agile teams to design, develop, and support cloud-native data platforms and applications using modern full-stack technologies.
- Engineer with Impact: Work alongside specialists in machine learning and distributed systems to create robust data pipelines, APIs, and infrastructure to support advanced analytics and real-time applications.
- Write High-Quality Code: Develop clean, maintainable, and performant code using languages such as Java, Scala, Python, and SQL.
- Utilize Cloud Platforms: Leverage cloud-based tools and services including Databricks, Snowflake, and offerings from AWS, GCP, or Azure.
- Contribute to Technical Growth: Stay ahead of technology trends, actively participate in internal and external tech communities, and mentor team members.
- Ensure Quality Delivery: Perform unit testing, peer code reviews, and performance optimization to ensure scalable and resilient systems.
- Drive Real-Time Solutions: Develop and maintain data solutions built for streaming and real-time processing architectures.
Basic Qualifications:
- Bachelor’s degree in Computer Science or a related field
- Minimum of 3 years of experience in application development (excluding internships)
- At least 1 year of experience with big data technologies
Preferred Qualifications:
- 5 years of experience developing in Python, SQL, Scala, or Java
- 2 years of experience working with a major cloud provider (AWS, Azure, or GCP)
- 3 years of hands-on experience with distributed computing tools like Spark, Hadoop, Hive, Kafka, MapReduce, or EMR
- 2 years of experience with real-time or streaming data systems
- Experience with NoSQL databases such as MongoDB or Cassandra
- Experience with data warehousing platforms like Databricks or Snowflake
- 3 years of experience in Linux/UNIX environments, including shell scripting
- Strong background in Agile engineering practices
This opportunity is ideal for engineers who are passionate about building intelligent, cloud-based systems and making data work at scale. If you’re ready to innovate and build the infrastructure that powers the future, we want to hear from you.